Oct 10 06:51:11 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 10 06:51:11 crc restorecon[4700]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:11 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 10 06:51:12 crc restorecon[4700]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 
06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 10 06:51:12
crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 
06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 10 06:51:12 crc restorecon[4700]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 
crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc 
restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 10 06:51:12 crc restorecon[4700]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 10 06:51:13 crc kubenswrapper[4732]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 10 06:51:13 crc kubenswrapper[4732]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 10 06:51:13 crc kubenswrapper[4732]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 10 06:51:13 crc kubenswrapper[4732]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 10 06:51:13 crc kubenswrapper[4732]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 10 06:51:13 crc kubenswrapper[4732]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.387604 4732 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393182 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393213 4732 feature_gate.go:330] unrecognized feature gate: Example Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393222 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393231 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393238 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393245 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393251 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393256 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393262 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393269 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393276 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393284 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393290 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393297 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393304 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393314 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393335 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393342 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393349 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393355 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393362 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393369 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393376 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393382 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393388 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393395 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393402 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393408 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393414 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393420 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393426 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393432 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393439 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393445 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393452 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393459 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393465 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393472 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393480 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393515 4732 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393521 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393527 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393533 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393540 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393546 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393554 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393560 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393566 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393573 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393579 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393585 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393592 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393599 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393608 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393616 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393624 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393629 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393636 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393642 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393648 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393653 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393658 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393663 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393670 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393675 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393679 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393684 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393689 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393694 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393720 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.393725 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395207 4732 flags.go:64] FLAG: --address="0.0.0.0"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395226 4732 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395239 4732 flags.go:64] FLAG: --anonymous-auth="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395249 4732 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395257 4732 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395263 4732 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395272 4732 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395279 4732 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395286 4732 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395292 4732 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395299 4732 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395305 4732 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395311 4732 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395317 4732 flags.go:64] FLAG: --cgroup-root=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395323 4732 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395329 4732 flags.go:64] FLAG: --client-ca-file=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395335 4732 flags.go:64] FLAG: --cloud-config=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395341 4732 flags.go:64] FLAG: --cloud-provider=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395348 4732 flags.go:64] FLAG: --cluster-dns="[]"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395356 4732 flags.go:64] FLAG: --cluster-domain=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395363 4732 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395369 4732 flags.go:64] FLAG: --config-dir=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395375 4732 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395382 4732 flags.go:64] FLAG: --container-log-max-files="5"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395390 4732 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395396 4732 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395402 4732 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395408 4732 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395414 4732 flags.go:64] FLAG: --contention-profiling="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395420 4732 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395426 4732 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395432 4732 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395438 4732 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395446 4732 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395452 4732 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395458 4732 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395464 4732 flags.go:64] FLAG: --enable-load-reader="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395469 4732 flags.go:64] FLAG: --enable-server="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395475 4732 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395483 4732 flags.go:64] FLAG: --event-burst="100"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395490 4732 flags.go:64] FLAG: --event-qps="50"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395496 4732 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395502 4732 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395507 4732 flags.go:64] FLAG: --eviction-hard=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395515 4732 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395522 4732 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395528 4732 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395534 4732 flags.go:64] FLAG: --eviction-soft=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395541 4732 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395547 4732 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395553 4732 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395559 4732 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395565 4732 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395571 4732 flags.go:64] FLAG: --fail-swap-on="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395577 4732 flags.go:64] FLAG: --feature-gates=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395585 4732 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395591 4732 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395597 4732 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395603 4732 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395608 4732 flags.go:64] FLAG: --healthz-port="10248"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395615 4732 flags.go:64] FLAG: --help="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395621 4732 flags.go:64] FLAG: --hostname-override=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395626 4732 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395632 4732 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395638 4732 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395643 4732 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395649 4732 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395655 4732 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395661 4732 flags.go:64] FLAG: --image-service-endpoint=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395666 4732 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395672 4732 flags.go:64] FLAG: --kube-api-burst="100"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395678 4732 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395684 4732 flags.go:64] FLAG: --kube-api-qps="50"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395694 4732 flags.go:64] FLAG: --kube-reserved=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395717 4732 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395723 4732 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395729 4732 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395735 4732 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395741 4732 flags.go:64] FLAG: --lock-file=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395746 4732 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395753 4732 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395759 4732 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395768 4732 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395774 4732 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395780 4732 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395785 4732 flags.go:64] FLAG: --logging-format="text"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395791 4732 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395797 4732 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395802 4732 flags.go:64] FLAG: --manifest-url=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395814 4732 flags.go:64] FLAG: --manifest-url-header=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395822 4732 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395827 4732 flags.go:64] FLAG: --max-open-files="1000000"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395835 4732 flags.go:64] FLAG: --max-pods="110"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395841 4732 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395846 4732 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395852 4732 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395858 4732 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395863 4732 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395869 4732 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395874 4732 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395887 4732 flags.go:64] FLAG: --node-status-max-images="50"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395893 4732 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395898 4732 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395904 4732 flags.go:64] FLAG: --pod-cidr=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395910 4732 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395917 4732 flags.go:64] FLAG: --pod-manifest-path=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395923 4732 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395929 4732 flags.go:64] FLAG: --pods-per-core="0"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395935 4732 flags.go:64] FLAG: --port="10250"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395941 4732 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395947 4732 flags.go:64] FLAG: --provider-id=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395953 4732 flags.go:64] FLAG: --qos-reserved=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395959 4732 flags.go:64] FLAG: --read-only-port="10255"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395965 4732 flags.go:64] FLAG: --register-node="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395971 4732 flags.go:64] FLAG: --register-schedulable="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395976 4732 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395985 4732 flags.go:64] FLAG: --registry-burst="10"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395991 4732 flags.go:64] FLAG: --registry-qps="5"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.395997 4732 flags.go:64] FLAG: --reserved-cpus=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396002 4732 flags.go:64] FLAG: --reserved-memory=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396010 4732 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396016 4732 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396021 4732 flags.go:64] FLAG: --rotate-certificates="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396027 4732 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396033 4732 flags.go:64] FLAG: --runonce="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396039 4732 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396045 4732 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396051 4732 flags.go:64] FLAG: --seccomp-default="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396057 4732 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396063 4732 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396069 4732 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396075 4732 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396081 4732 flags.go:64] FLAG: --storage-driver-password="root"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396087 4732 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396093 4732 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396098 4732 flags.go:64] FLAG: --storage-driver-user="root"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396104 4732 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396110 4732 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396116 4732 flags.go:64] FLAG: --system-cgroups=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396121 4732 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396130 4732 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396135 4732 flags.go:64] FLAG: --tls-cert-file=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396141 4732 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396148 4732 flags.go:64] FLAG: --tls-min-version=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396153 4732 flags.go:64] FLAG: --tls-private-key-file=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396159 4732 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396164 4732 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396170 4732 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396175 4732 flags.go:64] FLAG: --v="2"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396183 4732 flags.go:64] FLAG: --version="false"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396191 4732 flags.go:64] FLAG: --vmodule=""
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396197 4732 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396203 4732 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396337 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396344 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396350 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396355 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396360 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396365 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396370 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396375 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396380 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396386 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396391 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396396 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396401 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396406 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396411 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396417 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396423 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396430 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396435 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396440 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396445 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396450 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396456 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396461 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396470 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396475 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396480 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396485 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396490 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396495 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396500 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396504 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396509 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396514 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396519 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396524 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396529 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396533 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396538 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396543 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396548 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396553 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396557 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396562 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396568 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396573 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396579 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396585 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396591 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396596 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396601 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396606 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396610 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396616 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396622 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396629 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396636 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396641 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396647 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396652 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396658 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396663 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396668 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396673 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396679 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396685 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396694 4732 feature_gate.go:330] unrecognized feature gate: Example
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396726 4732 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396732 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396737 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.396743 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.396759 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.412176 4732 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.412563 4732 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412775 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412792 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412802 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412815 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412825 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412837 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412849 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412860 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412872 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412916 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412927 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412936 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412944 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412954 4732 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412967 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412979 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412989 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.412998 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413009 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413021 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413035 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413046 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413055 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413066 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413075 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413083 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413092 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413100 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413109 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 10 06:51:13 crc kubenswrapper[4732]: 
W1010 06:51:13.413117 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413126 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413135 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413143 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413153 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413162 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413171 4732 feature_gate.go:330] unrecognized feature gate: Example Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413182 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413193 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413203 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413216 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413227 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413239 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413250 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413261 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413274 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413285 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413296 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413307 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413318 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413331 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413346 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 
06:51:13.413357 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413368 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413379 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413391 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413403 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413413 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413424 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413435 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413446 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413456 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413471 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413484 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413495 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413506 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413517 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413528 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413539 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413549 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413561 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.413572 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.413591 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414060 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 
06:51:13.414084 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414098 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414113 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414127 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414139 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414152 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414166 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414178 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414194 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414209 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414222 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414236 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414251 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414267 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414285 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414298 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414309 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414324 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414336 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414350 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414364 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414377 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414388 4732 feature_gate.go:330] unrecognized feature gate: Example Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414399 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414410 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414422 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414433 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414446 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414462 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414477 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414488 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414498 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414510 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414521 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414533 4732 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414544 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414559 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414571 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414583 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414595 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414606 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414617 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414628 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414639 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414651 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414663 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414675 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414686 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414746 4732 feature_gate.go:330] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414758 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414770 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414781 4732 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414792 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414803 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414814 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414825 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414836 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414847 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414859 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414871 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414883 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414895 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414906 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 
06:51:13.414917 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414927 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414938 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414950 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414961 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414971 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.414983 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.414999 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.416226 4732 server.go:940] "Client rotation is on, will bootstrap in background" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.424651 4732 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.424891 4732 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.427083 4732 server.go:997] "Starting client certificate rotation" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.427137 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.427517 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-21 05:28:45.656042976 +0000 UTC Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.427772 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1006h37m32.228281669s for next certificate rotation Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.457653 4732 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.461153 4732 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.479629 4732 log.go:25] "Validated CRI v1 runtime API" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.521268 4732 log.go:25] "Validated CRI v1 image API" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.524757 4732 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.532997 4732 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-10-06-46-28-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.533059 4732 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.567310 4732 manager.go:217] Machine: {Timestamp:2025-10-10 06:51:13.563471875 +0000 UTC m=+0.633063216 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f97cf68a-a91c-438d-bef2-b95519e23c5d BootID:677988c9-53ea-44ee-b7e0-55b4b6597681 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:bb:a4:d3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:bb:a4:d3 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:da:30:c4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:53:f0:ee Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:20:4c:6f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:9a:5f:6a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:8c:c3:34 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:67:fa:d4:9d:36 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:cb:dc:78:07:95 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.567844 4732 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.568063 4732 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.569829 4732 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.570420 4732 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.570496 4732 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.570981 4732 topology_manager.go:138] "Creating topology manager with none policy" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.571002 4732 container_manager_linux.go:303] "Creating device plugin manager" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.571789 4732 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.571852 4732 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.572155 4732 state_mem.go:36] "Initialized new in-memory state store" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.572308 4732 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.576187 4732 kubelet.go:418] "Attempting to sync node with API server" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.576235 4732 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.576288 4732 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.576318 4732 kubelet.go:324] "Adding apiserver pod source" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.576344 4732 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 
06:51:13.584049 4732 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.585244 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.585279 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.585487 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.585387 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.585593 4732 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.588308 4732 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.589840 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.589893 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.589909 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.589923 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.589948 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.589962 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.589977 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.590001 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.590017 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.590035 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.590083 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.590120 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.592335 4732 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.593102 4732 server.go:1280] "Started kubelet" Oct 10 06:51:13 crc systemd[1]: Started Kubernetes Kubelet. Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.594607 4732 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.594732 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.594607 4732 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.595575 4732 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.596289 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.596342 4732 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.596511 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 06:38:57.664980959 +0000 UTC Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.596595 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1703h47m44.06839187s for next certificate rotation Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.596964 4732 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.596989 4732 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 10 06:51:13 crc 
kubenswrapper[4732]: I1010 06:51:13.597192 4732 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.596957 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.598818 4732 factory.go:55] Registering systemd factory Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.598840 4732 factory.go:221] Registration of the systemd container factory successfully Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.603511 4732 server.go:460] "Adding debug handlers to kubelet server" Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.603950 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.604133 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.604795 4732 factory.go:153] Registering CRI-O factory Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.604839 4732 factory.go:221] Registration of the crio container factory successfully Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.604962 4732 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 
06:51:13.604998 4732 factory.go:103] Registering Raw factory Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.605027 4732 manager.go:1196] Started watching for new ooms in manager Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.604525 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="200ms" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.605854 4732 manager.go:319] Starting recovery of all containers Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.604340 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.246:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186d0e97961cd8da default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-10 06:51:13.59305545 +0000 UTC m=+0.662646731,LastTimestamp:2025-10-10 06:51:13.59305545 +0000 UTC m=+0.662646731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617074 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617148 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617171 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617190 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617209 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617227 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617244 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617263 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617283 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617299 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617326 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617344 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617365 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617387 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617406 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617427 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617444 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617462 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617480 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617496 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617515 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617532 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617552 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617570 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617594 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617611 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" 
seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617635 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617655 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617674 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617737 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617757 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617776 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617793 4732 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617810 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617826 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617844 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617861 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617879 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617898 4732 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617915 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617932 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617951 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617972 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.617997 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618023 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618046 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618070 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618098 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618116 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618133 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618149 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618166 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618192 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618211 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618229 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618248 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618267 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618284 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618305 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618323 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618340 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618385 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618402 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618421 4732 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618437 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618456 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618472 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618490 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618507 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618526 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618546 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618563 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618580 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618597 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618614 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618631 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618648 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618667 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618684 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618736 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618755 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618772 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618790 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618807 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618824 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618843 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618861 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618879 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" 
seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.618897 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622095 4732 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622191 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622222 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622240 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622260 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622278 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622297 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622315 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622333 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622350 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622368 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622388 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622407 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622424 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622442 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622461 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622570 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622593 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622613 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622632 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622652 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622671 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622814 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" 
seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622836 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622856 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622875 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622895 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622914 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622933 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 10 06:51:13 crc 
kubenswrapper[4732]: I1010 06:51:13.622951 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622970 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.622988 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623010 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623028 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623046 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623065 4732 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623083 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623101 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623121 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623140 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623158 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623177 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623197 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623214 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623244 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623262 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623280 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623300 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623318 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623335 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623352 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623370 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623387 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623407 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623425 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623443 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623460 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623479 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623497 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623516 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623534 4732 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623552 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623570 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623586 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623604 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623624 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623643 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623660 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623739 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623760 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623780 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623798 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623815 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623834 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623851 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623869 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623889 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623907 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623926 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623945 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623963 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.623980 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624000 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624018 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624035 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" 
seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624053 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624082 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624099 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624115 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624132 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624150 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624168 4732 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624186 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624203 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624220 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624236 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624255 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624272 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624289 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624306 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624325 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624343 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624362 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624379 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624398 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624415 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624433 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624449 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624467 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624483 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624555 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624582 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624601 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624620 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624637 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624655 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624672 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624720 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624741 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624758 4732 reconstruct.go:97] "Volume reconstruction finished" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.624772 4732 reconciler.go:26] "Reconciler: start to sync state" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.639784 4732 manager.go:324] Recovery completed Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.656913 4732 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.658862 4732 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.658913 4732 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.658946 4732 kubelet.go:2335] "Starting kubelet main sync loop" Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.659014 4732 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.660222 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: W1010 06:51:13.661218 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.661452 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.661912 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.661966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.661984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.663041 4732 
cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.663070 4732 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.663094 4732 state_mem.go:36] "Initialized new in-memory state store" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.687670 4732 policy_none.go:49] "None policy: Start" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.689055 4732 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.689083 4732 state_mem.go:35] "Initializing new in-memory state store" Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.697298 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.756342 4732 manager.go:334] "Starting Device Plugin manager" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.756618 4732 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.756649 4732 server.go:79] "Starting device plugin registration server" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.757222 4732 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.757250 4732 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.757825 4732 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.757944 4732 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.757956 4732 plugin_manager.go:118] "Starting 
Kubelet Plugin Manager" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.759084 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.759184 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.761094 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.761127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.761138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.761256 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.761616 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.761695 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.762311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.762390 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.762400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.762500 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.762681 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.762755 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.763081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.763236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.763355 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.763973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.764628 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.765391 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.765686 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.765733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.765744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.765857 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766025 
4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766072 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766690 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766725 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766738 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766824 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766831 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766961 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.766988 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767472 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767624 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767646 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.767652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.768201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.768234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.768246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.768583 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.806306 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="400ms" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827619 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827682 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827771 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827898 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827930 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827960 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.827989 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.828028 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.828069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.828106 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.828143 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.828174 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.828234 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.828263 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.858199 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.859808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.859851 4732 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.859866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.859945 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:51:13 crc kubenswrapper[4732]: E1010 06:51:13.860504 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.929912 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.929979 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930061 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930096 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930126 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930153 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930162 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930230 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc 
kubenswrapper[4732]: I1010 06:51:13.930242 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930273 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930327 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930181 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930343 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930363 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930397 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930431 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930466 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930498 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 
06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930539 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930545 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930584 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930630 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930631 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930674 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930772 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930784 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:13 crc kubenswrapper[4732]: I1010 06:51:13.930729 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.060914 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.062839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.062895 4732 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.062945 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.062992 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:51:14 crc kubenswrapper[4732]: E1010 06:51:14.063759 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.111779 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.124589 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.150846 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:14 crc kubenswrapper[4732]: W1010 06:51:14.170495 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6b6555ca16b575fd8caf415f4e73a090fb94512ceb9a68f85b9815b3b5dc8c5f WatchSource:0}: Error finding container 6b6555ca16b575fd8caf415f4e73a090fb94512ceb9a68f85b9815b3b5dc8c5f: Status 404 returned error can't find the container with id 6b6555ca16b575fd8caf415f4e73a090fb94512ceb9a68f85b9815b3b5dc8c5f Oct 10 06:51:14 crc kubenswrapper[4732]: W1010 06:51:14.171602 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-eb509023b0ba357388df3402a906d17a88aeb70106ae5eb2eee30639dd77446e WatchSource:0}: Error finding container eb509023b0ba357388df3402a906d17a88aeb70106ae5eb2eee30639dd77446e: Status 404 returned error can't find the container with id eb509023b0ba357388df3402a906d17a88aeb70106ae5eb2eee30639dd77446e Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.173300 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:14 crc kubenswrapper[4732]: W1010 06:51:14.177816 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e2375e23956c0e8a45070b441f4f359be370a378a66e7c821560931d8ee3ec51 WatchSource:0}: Error finding container e2375e23956c0e8a45070b441f4f359be370a378a66e7c821560931d8ee3ec51: Status 404 returned error can't find the container with id e2375e23956c0e8a45070b441f4f359be370a378a66e7c821560931d8ee3ec51 Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.181737 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:14 crc kubenswrapper[4732]: W1010 06:51:14.202400 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-203163b14076f73863af58a6ddb58a582bf314f66369738b226e7171f4fcdfba WatchSource:0}: Error finding container 203163b14076f73863af58a6ddb58a582bf314f66369738b226e7171f4fcdfba: Status 404 returned error can't find the container with id 203163b14076f73863af58a6ddb58a582bf314f66369738b226e7171f4fcdfba Oct 10 06:51:14 crc kubenswrapper[4732]: E1010 06:51:14.207319 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="800ms" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.464613 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.466096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.466148 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.466159 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.466192 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:51:14 crc kubenswrapper[4732]: E1010 06:51:14.466769 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 10 06:51:14 crc kubenswrapper[4732]: W1010 06:51:14.526438 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:14 crc kubenswrapper[4732]: E1010 06:51:14.526554 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.596094 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:14 crc kubenswrapper[4732]: W1010 06:51:14.636200 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:14 crc kubenswrapper[4732]: E1010 06:51:14.636348 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.665779 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"eb509023b0ba357388df3402a906d17a88aeb70106ae5eb2eee30639dd77446e"} Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.667149 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6b6555ca16b575fd8caf415f4e73a090fb94512ceb9a68f85b9815b3b5dc8c5f"} Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.668653 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"203163b14076f73863af58a6ddb58a582bf314f66369738b226e7171f4fcdfba"} Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.670140 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fcfe9e9c9142a41df2dc04ebf781273a829d7efc8137822a808249a0e6c6e19a"} Oct 10 06:51:14 crc kubenswrapper[4732]: I1010 06:51:14.671007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e2375e23956c0e8a45070b441f4f359be370a378a66e7c821560931d8ee3ec51"} Oct 10 06:51:15 crc kubenswrapper[4732]: E1010 06:51:15.008267 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="1.6s" Oct 10 06:51:15 crc kubenswrapper[4732]: W1010 06:51:15.079454 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:15 crc kubenswrapper[4732]: E1010 06:51:15.079592 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:15 crc kubenswrapper[4732]: W1010 06:51:15.082983 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:15 crc kubenswrapper[4732]: E1010 06:51:15.083102 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:15 
crc kubenswrapper[4732]: I1010 06:51:15.267109 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.268493 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.268552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.268568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.268597 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:51:15 crc kubenswrapper[4732]: E1010 06:51:15.269067 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.596184 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.676144 4732 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee" exitCode=0 Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.676238 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 
06:51:15.676324 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.677622 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.677660 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.677670 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.680765 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.680768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.680828 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.680850 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.680873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.685398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.685465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.685483 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.686467 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787" exitCode=0 Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.686645 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.686676 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.687979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.688078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.688106 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.689274 4732 generic.go:334] 
"Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1" exitCode=0 Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.689380 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.689439 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.690423 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.690828 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.690876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.690901 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.691446 4732 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7" exitCode=0 Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.691490 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7"} Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.691552 4732 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.691589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.691642 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.691652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.693178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.693221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:15 crc kubenswrapper[4732]: I1010 06:51:15.693245 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.404468 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.595605 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:16 crc kubenswrapper[4732]: E1010 06:51:16.609565 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.246:6443: connect: connection refused" interval="3.2s" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.696807 4732 
generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9" exitCode=0 Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.696925 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.696991 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.699147 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.699237 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.699256 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.705437 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.705412 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.709391 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.709449 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.709471 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.709793 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.709830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.709844 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.709925 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.710879 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.710920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.710979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.717239 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 
10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.717259 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.717326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.717341 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.717352 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70"} Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.721358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.721392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.721405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:16 crc kubenswrapper[4732]: W1010 06:51:16.824339 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:16 crc kubenswrapper[4732]: E1010 06:51:16.824482 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.869202 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.871022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.871097 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.871113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:16 crc kubenswrapper[4732]: I1010 06:51:16.871150 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:51:16 crc kubenswrapper[4732]: E1010 06:51:16.872019 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.246:6443: connect: connection refused" node="crc" Oct 10 06:51:16 crc kubenswrapper[4732]: W1010 06:51:16.995038 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.246:6443: connect: connection refused Oct 10 06:51:16 crc 
kubenswrapper[4732]: E1010 06:51:16.995169 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.246:6443: connect: connection refused" logger="UnhandledError" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.503669 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.723318 4732 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e" exitCode=0 Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.724048 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e"} Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.724173 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.726232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.726293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.726310 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.731872 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:17 crc 
kubenswrapper[4732]: I1010 06:51:17.733005 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.733726 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb"} Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.733819 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.733884 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.733943 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.734494 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.734530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.734495 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.734553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.734566 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.734584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:17 crc 
kubenswrapper[4732]: I1010 06:51:17.735008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.735045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.735062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.735938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.735984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:17 crc kubenswrapper[4732]: I1010 06:51:17.736003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.740797 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de"} Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.740848 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47"} Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.740871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba"} Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.740897 4732 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.741001 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.741042 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.744074 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.744120 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.744139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.744186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.744249 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:18 crc kubenswrapper[4732]: I1010 06:51:18.744293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.014112 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.014603 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.016628 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:19 crc 
kubenswrapper[4732]: I1010 06:51:19.016670 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.016682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.274252 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.749310 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97"} Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.749385 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3"} Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.749431 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.750109 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.750351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.750551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.750573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 
06:51:19.751574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.751638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:19 crc kubenswrapper[4732]: I1010 06:51:19.751662 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.072424 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.073961 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.074029 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.074052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.074100 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.751510 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.752040 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.752496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.752561 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:20 crc 
kubenswrapper[4732]: I1010 06:51:20.752584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.752858 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.752902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:20 crc kubenswrapper[4732]: I1010 06:51:20.752917 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:21 crc kubenswrapper[4732]: I1010 06:51:21.859141 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:21 crc kubenswrapper[4732]: I1010 06:51:21.859592 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:21 crc kubenswrapper[4732]: I1010 06:51:21.861387 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:21 crc kubenswrapper[4732]: I1010 06:51:21.861453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:21 crc kubenswrapper[4732]: I1010 06:51:21.861474 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:22 crc kubenswrapper[4732]: I1010 06:51:22.504255 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:22 crc kubenswrapper[4732]: I1010 06:51:22.504509 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:22 crc kubenswrapper[4732]: I1010 06:51:22.505982 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:22 crc kubenswrapper[4732]: I1010 06:51:22.506013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:22 crc kubenswrapper[4732]: I1010 06:51:22.506023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.191170 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.191448 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.193297 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.193349 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.193360 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.198145 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.760996 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.762245 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.762301 4732 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:23 crc kubenswrapper[4732]: I1010 06:51:23.762321 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:23 crc kubenswrapper[4732]: E1010 06:51:23.768842 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 10 06:51:24 crc kubenswrapper[4732]: I1010 06:51:24.241457 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 10 06:51:24 crc kubenswrapper[4732]: I1010 06:51:24.241714 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:24 crc kubenswrapper[4732]: I1010 06:51:24.242927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:24 crc kubenswrapper[4732]: I1010 06:51:24.242991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:24 crc kubenswrapper[4732]: I1010 06:51:24.243008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:24 crc kubenswrapper[4732]: I1010 06:51:24.859563 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:51:24 crc kubenswrapper[4732]: I1010 06:51:24.859664 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 06:51:26 crc kubenswrapper[4732]: I1010 06:51:26.020917 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 10 06:51:26 crc kubenswrapper[4732]: I1010 06:51:26.021018 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 10 06:51:26 crc kubenswrapper[4732]: I1010 06:51:26.435318 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 10 06:51:26 crc kubenswrapper[4732]: I1010 06:51:26.435582 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:26 crc kubenswrapper[4732]: I1010 06:51:26.437113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:26 crc kubenswrapper[4732]: I1010 06:51:26.437161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:26 crc kubenswrapper[4732]: I1010 06:51:26.437180 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:27 crc kubenswrapper[4732]: W1010 06:51:27.269414 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.269519 4732 trace.go:236] Trace[767488578]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Oct-2025 06:51:17.267) (total time: 10002ms): Oct 10 06:51:27 crc kubenswrapper[4732]: Trace[767488578]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:51:27.269) Oct 10 06:51:27 crc kubenswrapper[4732]: Trace[767488578]: [10.002126776s] [10.002126776s] END Oct 10 06:51:27 crc kubenswrapper[4732]: E1010 06:51:27.269544 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.478853 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.478931 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.509551 4732 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.509612 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.510298 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.510456 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.511584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.511630 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:27 crc kubenswrapper[4732]: I1010 06:51:27.511643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:30 crc kubenswrapper[4732]: I1010 06:51:30.718508 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.482799 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context 
deadline exceeded" interval="6.4s" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.485274 4732 trace.go:236] Trace[2011643738]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Oct-2025 06:51:21.737) (total time: 10747ms): Oct 10 06:51:32 crc kubenswrapper[4732]: Trace[2011643738]: ---"Objects listed" error: 10747ms (06:51:32.485) Oct 10 06:51:32 crc kubenswrapper[4732]: Trace[2011643738]: [10.747858956s] [10.747858956s] END Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.485316 4732 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.485499 4732 trace.go:236] Trace[2133311142]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Oct-2025 06:51:18.058) (total time: 14426ms): Oct 10 06:51:32 crc kubenswrapper[4732]: Trace[2133311142]: ---"Objects listed" error: 14426ms (06:51:32.485) Oct 10 06:51:32 crc kubenswrapper[4732]: Trace[2133311142]: [14.426529945s] [14.426529945s] END Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.485528 4732 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.485883 4732 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.488263 4732 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.494166 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.513955 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.525631 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.529852 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50994->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.529932 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52714->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.529933 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50994->192.168.126.11:17697: read: connection reset by peer" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.530016 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:52714->192.168.126.11:17697: read: connection reset by peer" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.588264 4732 apiserver.go:52] "Watching apiserver" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.590828 4732 reflector.go:368] Caches populated for *v1.Pod 
from pkg/kubelet/config/apiserver.go:66 Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.591099 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.591428 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.591499 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.591556 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.591580 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.591728 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.591885 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.591935 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.591977 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.592038 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.593909 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.594632 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.594908 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.594952 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.594978 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.595100 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.595140 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.595265 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.595313 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.598256 4732 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 10 06:51:32 crc kubenswrapper[4732]: 
I1010 06:51:32.620123 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.629340 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.641210 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.643635 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.658506 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.668942 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.679721 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689186 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689235 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689261 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689281 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689308 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689346 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689367 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689385 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689428 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: 
\"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689447 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689466 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689600 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689604 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: 
"bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689624 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689607 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689649 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689671 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689721 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689744 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689766 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689787 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689809 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689829 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689848 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689891 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689912 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689934 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689955 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689974 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689998 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690018 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690039 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690060 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690082 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.689860 
4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690078 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690599 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690767 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690776 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690784 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690938 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.690978 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691005 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691009 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691095 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691134 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691212 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691231 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.691383 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:51:33.1913557 +0000 UTC m=+20.260946941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.692567 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.692790 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691417 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691457 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691652 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691673 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.691722 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694231 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694350 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694422 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694425 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694487 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694521 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694550 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694576 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694604 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694629 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694651 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694676 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694719 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694772 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.694797 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694823 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694848 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694870 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694897 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694922 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694946 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694965 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.694987 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695050 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695088 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695089 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695117 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695296 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695328 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695367 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695396 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695411 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695470 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695555 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695596 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695708 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.692007 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695793 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695848 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695876 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.695902 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695928 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695953 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696383 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696450 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696504 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696534 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696584 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696614 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696709 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696741 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696796 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696849 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696876 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696960 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697045 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697073 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697124 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697363 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697526 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.697586 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697619 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697645 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697667 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697868 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697903 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698061 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698098 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698124 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698167 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698225 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698291 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698573 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698603 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698636 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" 
(UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698781 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698844 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698894 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698922 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698973 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.698998 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699056 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699126 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699236 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699264 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699336 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699427 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699455 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699656 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699769 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 
06:51:32.699823 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.699894 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695809 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696118 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696511 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696581 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696631 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696791 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.696942 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.700204 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.700265 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.700297 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.700324 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.700349 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701068 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701747 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701804 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701829 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701850 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701869 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701889 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701908 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701950 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.702066 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.702150 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703270 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704639 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704757 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706567 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706753 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706930 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706992 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.707522 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.708020 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709095 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709141 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709162 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709181 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709198 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709217 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709234 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709251 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709269 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709287 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709305 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709326 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709345 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.710304 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711306 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711325 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711344 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711360 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711380 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " 
Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711399 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711416 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712304 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712380 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712399 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712420 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712444 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712469 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712493 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712517 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.712542 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712565 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712585 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712622 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712642 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712660 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712681 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713301 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713357 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713383 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713412 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 10 
06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713438 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713463 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713569 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713607 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713649 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.713707 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713735 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713758 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713782 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713809 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713926 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713950 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713976 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714001 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714203 4732 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714228 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714243 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714255 4732 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714268 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714280 4732 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714291 4732 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714304 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714316 4732 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714329 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714342 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714353 4732 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714365 4732 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714377 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714389 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714401 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714416 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714428 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714443 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714486 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714502 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714743 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714768 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714784 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714799 4732 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714810 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.714823 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714839 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697021 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714893 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697339 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697378 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697414 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697449 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697626 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697725 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697794 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.697917 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698048 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698139 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698258 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698322 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698493 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698591 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698605 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698710 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.698950 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.700788 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.695458 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715365 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715425 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.700996 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701790 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701851 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.702098 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.702114 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701571 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701586 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703208 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.701336 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703514 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703545 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703549 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703784 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.702587 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703927 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.703962 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704180 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704415 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704436 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714853 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715708 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715726 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715739 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715752 4732 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715767 4732 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715788 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 
06:51:32.715808 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715822 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.702973 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715840 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704624 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704641 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704931 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.704933 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.705342 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.705436 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.705616 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.705680 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.705811 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.705939 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.705989 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706180 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706303 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706398 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706480 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706677 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.706829 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.707116 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.707039 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.707561 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.707520 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.707908 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.707908 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.708827 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.708826 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.708896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.708903 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.708913 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709005 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709034 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709050 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709056 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709287 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709306 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709537 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709548 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709474 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.709987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.710023 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.710219 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.710246 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.710698 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.710883 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711117 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711229 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711306 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711345 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711625 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711746 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.711782 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712165 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712397 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712434 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712443 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.712828 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713176 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713355 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713534 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713542 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713606 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713833 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713858 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713872 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.713986 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714039 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.714561 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715262 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715898 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.716058 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.716061 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.716122 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.716194 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.716298 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.716732 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.716748 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.716952 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.717063 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.717081 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.717289 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.715159 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.718145 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.718270 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:33.218247932 +0000 UTC m=+20.287839173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.718742 4732 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.719301 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.719328 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.719456 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.719487 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.719744 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.719810 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.719883 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.720042 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:33.2199806 +0000 UTC m=+20.289571901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.720252 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.720312 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.720438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.720624 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.720676 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.723062 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.723970 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.728512 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.729889 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.732514 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.732784 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.733106 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.733256 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.733355 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.733603 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.734365 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.734391 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.734406 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.734462 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:33.234442839 +0000 UTC m=+20.304034160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.735998 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.736221 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.736242 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.736309 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:32 crc kubenswrapper[4732]: E1010 06:51:32.736380 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:33.236352141 +0000 UTC m=+20.305943382 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.736533 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.738000 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.738515 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.738601 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.738861 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.738969 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.739046 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.739278 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.739457 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.739765 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.740844 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.740918 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.741651 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.741853 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.742077 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.742259 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.744081 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.746310 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.765610 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.784924 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.786916 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb" exitCode=255 Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.786957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb"} Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.787667 4732 scope.go:117] "RemoveContainer" containerID="b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.798988 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.809166 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816459 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816532 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816601 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816618 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816632 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816644 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816657 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816670 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816682 4732 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816749 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816802 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816846 4732 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816881 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816892 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" 
DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.816990 4732 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817006 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817023 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817045 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817064 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817077 4732 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817092 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817105 4732 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817124 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817143 4732 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817156 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817170 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817184 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817197 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817211 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817223 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817242 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817255 4732 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817267 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817280 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817292 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817304 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817318 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817331 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817345 4732 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817360 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817372 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817384 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817396 4732 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node 
\"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817409 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817421 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817434 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817446 4732 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817458 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817470 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817483 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817496 4732 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817508 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817520 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817533 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817546 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817558 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817570 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817581 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817593 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817609 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817620 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817631 4732 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817643 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817656 4732 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817668 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817681 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817715 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817727 4732 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817743 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817755 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817768 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817780 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817794 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817809 4732 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817822 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817838 4732 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817851 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817863 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817875 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817887 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817899 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817911 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817923 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817934 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817946 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817959 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817972 4732 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817984 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.817996 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818009 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818024 4732 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818037 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818049 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818062 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: 
\"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818074 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818087 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818099 4732 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818112 4732 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818125 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818137 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818150 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on 
node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818162 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818175 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818221 4732 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818258 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818270 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818320 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818333 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818348 4732 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818403 4732 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818420 4732 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818436 4732 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818500 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818518 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818532 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818582 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818871 4732 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818929 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818948 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818965 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.818983 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819000 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819022 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node 
\"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819048 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819073 4732 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819092 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819107 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819120 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819133 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819146 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.819159 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819173 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819185 4732 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819199 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819212 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819232 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819246 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc 
kubenswrapper[4732]: I1010 06:51:32.819261 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819275 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819292 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819305 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819340 4732 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819361 4732 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819373 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819393 4732 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819412 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819425 4732 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819438 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819466 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819479 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819492 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819507 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819522 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819536 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819547 4732 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819559 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819572 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819586 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819599 4732 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 
10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.819612 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.821443 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.830573 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.842076 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.855129 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.865421 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.903320 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.912082 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 10 06:51:32 crc kubenswrapper[4732]: I1010 06:51:32.918264 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 10 06:51:32 crc kubenswrapper[4732]: W1010 06:51:32.923979 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-25afd4dd6db773abd6fc4f34750a174a0def2aa9a37bad2760f3e2ea4ce06af8 WatchSource:0}: Error finding container 25afd4dd6db773abd6fc4f34750a174a0def2aa9a37bad2760f3e2ea4ce06af8: Status 404 returned error can't find the container with id 25afd4dd6db773abd6fc4f34750a174a0def2aa9a37bad2760f3e2ea4ce06af8 Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.222184 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.222289 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.222350 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:51:34.222311558 +0000 UTC m=+21.291902819 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.222379 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.222406 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.222437 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:34.222418511 +0000 UTC m=+21.292009762 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.222546 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.222597 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:34.222587605 +0000 UTC m=+21.292178956 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.258096 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.265159 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.269401 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.270048 4732 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.282973 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.292793 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.315249 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.322933 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.322981 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323132 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323159 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323172 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323175 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323205 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323218 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323237 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:34.323216552 +0000 UTC m=+21.392807793 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.323278 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:34.323257963 +0000 UTC m=+21.392849204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.337156 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.351894 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.362884 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.374007 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.384228 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.396834 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.407665 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.419919 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.436051 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.452464 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.464737 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.659770 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.660003 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.665127 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.665910 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.667023 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.667935 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.668814 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.669081 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.669555 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.671735 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.672653 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.674433 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.675188 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.676681 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.677666 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.679040 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.679798 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.681088 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.682634 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.683270 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"
/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-o
perator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.684856 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.685476 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.686936 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.687967 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.688686 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.690306 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.690927 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.691814 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.692527 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.693281 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.694133 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.696248 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.696948 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.697464 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.698313 4732 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.698414 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.700348 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.701440 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.701724 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.701854 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.703326 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.704466 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.704957 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.706624 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.707296 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.708337 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.708930 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.709919 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.710476 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.711309 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.711895 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.712770 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.713602 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.713720 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.714435 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.714915 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.715370 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.716330 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.716864 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.717737 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.734614 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.753036 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.779062 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.790535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e9b845fc1f6f7b300b67224090a1648fa0e849bbca06b630ff375b7b5ff89803"} Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.792077 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db"} Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.792136 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9b60f50e20c9e5eef7ae14f22c3689c85042c939c21a64cdbbb97a5a29b9d055"} Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.793425 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7"} Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.793449 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f"} Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.793460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"25afd4dd6db773abd6fc4f34750a174a0def2aa9a37bad2760f3e2ea4ce06af8"} Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.795752 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.796099 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.798665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b"} Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.798716 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:33 crc kubenswrapper[4732]: E1010 06:51:33.806806 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.811284 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.825421 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.846927 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.859936 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.872427 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.884018 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.898637 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.912925 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.930324 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.948314 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.966341 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imag
eID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.976948 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:33 crc kubenswrapper[4732]: I1010 06:51:33.996173 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.009075 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.022099 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.036017 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.229174 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.229254 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.229280 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.229399 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.229461 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:36.229443242 +0000 UTC m=+23.299034483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.229544 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:51:36.229502124 +0000 UTC m=+23.299093365 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.229625 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.229786 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:36.229763441 +0000 UTC m=+23.299354682 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.329940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.329992 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330117 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330138 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330149 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330117 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330225 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330237 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330200 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:36.330182451 +0000 UTC m=+23.399773692 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.330284 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-10 06:51:36.330274274 +0000 UTC m=+23.399865515 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.659279 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:34 crc kubenswrapper[4732]: I1010 06:51:34.659315 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.659512 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:34 crc kubenswrapper[4732]: E1010 06:51:34.659604 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:35 crc kubenswrapper[4732]: I1010 06:51:35.659893 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:35 crc kubenswrapper[4732]: E1010 06:51:35.660115 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.249149 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.249259 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.249301 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:51:40.249272283 +0000 UTC m=+27.318863564 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.249345 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.249367 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.249452 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:40.249424357 +0000 UTC m=+27.319015638 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.249525 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.249561 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:40.249554711 +0000 UTC m=+27.319145952 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.350103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.350173 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.350358 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.350439 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.350388 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.350458 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.350481 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.350500 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 
06:51:36.350548 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:40.350524296 +0000 UTC m=+27.420115557 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.350593 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:40.350565887 +0000 UTC m=+27.420157218 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.472751 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.493144 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.495355 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.509127 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.526298 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.548649 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.567936 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.584653 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.599523 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.619611 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.636921 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.651998 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.659644 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.659728 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.659876 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:36 crc kubenswrapper[4732]: E1010 06:51:36.660034 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.666492 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.677729 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.693243 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.728453 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.741031 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:
15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.755166 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.768975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.783567 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.807060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469"} Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.821307 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.834766 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 
06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.849538 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.867017 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.900524 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.920377 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.935787 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.951960 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:36 crc kubenswrapper[4732]: I1010 06:51:36.972002 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:37 crc kubenswrapper[4732]: I1010 06:51:37.659559 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:37 crc kubenswrapper[4732]: E1010 06:51:37.659731 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.659875 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.659908 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:38 crc kubenswrapper[4732]: E1010 06:51:38.660060 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:38 crc kubenswrapper[4732]: E1010 06:51:38.660154 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.703588 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pnlkp"] Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.703920 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-292kd"] Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.704103 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.704157 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.704112 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5r28v"] Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.704496 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2fxmr"] Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.704949 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.705233 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kdb2x"] Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.705527 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.705958 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.710766 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.710854 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.710896 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711162 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711179 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711406 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711459 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711539 4732 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711728 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711836 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711856 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711914 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.711952 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.712043 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.712147 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.712345 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.712368 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.712501 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.712551 4732 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.712708 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.717771 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.717973 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.742200 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.770226 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-rootfs\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771404 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-script-lib\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771431 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-node-log\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771464 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-k8s-cni-cncf-io\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771515 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-mcd-auth-proxy-config\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771543 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aef461dc-7905-4f5e-a90e-046ffcf8258d-cni-binary-copy\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771560 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-daemon-config\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771576 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-multus-certs\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771591 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-os-release\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771606 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-netns\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771661 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovn-node-metrics-cert\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771675 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-hostroot\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771719 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-systemd\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771735 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-env-overrides\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771751 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-cni-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771770 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-os-release\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771809 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-kubelet\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771830 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-cnibin\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771850 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-netd\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771872 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-socket-dir-parent\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771893 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-conf-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771912 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-etc-kubernetes\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771932 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771955 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-ovn\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.771982 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-netns\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772003 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0b9d04cf-acc2-45e0-8e1c-23c28c061af4-hosts-file\") pod \"node-resolver-5r28v\" (UID: \"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\") " pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772016 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-cni-binary-copy\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772032 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-proxy-tls\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772048 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-system-cni-dir\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772062 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-slash\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772089 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-systemd-units\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772106 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-var-lib-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772119 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-etc-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772137 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-kubelet\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772176 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-system-cni-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772189 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-cni-bin\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772203 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-cni-multus\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772216 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fnwz\" (UniqueName: \"kubernetes.io/projected/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-kube-api-access-4fnwz\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772238 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-log-socket\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-cnibin\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772265 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772279 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgdch\" (UniqueName: \"kubernetes.io/projected/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-kube-api-access-xgdch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772345 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fnk4\" (UniqueName: \"kubernetes.io/projected/0b9d04cf-acc2-45e0-8e1c-23c28c061af4-kube-api-access-8fnk4\") pod \"node-resolver-5r28v\" (UID: \"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\") " pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772360 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46xph\" (UniqueName: 
\"kubernetes.io/projected/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-kube-api-access-46xph\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772374 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aef461dc-7905-4f5e-a90e-046ffcf8258d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772390 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlqtl\" (UniqueName: \"kubernetes.io/projected/aef461dc-7905-4f5e-a90e-046ffcf8258d-kube-api-access-qlqtl\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772404 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-bin\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.772417 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-config\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.779655 4732 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.789594 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.800015 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.812634 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.823571 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.838249 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.851034 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.872962 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-ovn\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873016 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873034 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-netns\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873050 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0b9d04cf-acc2-45e0-8e1c-23c28c061af4-hosts-file\") pod \"node-resolver-5r28v\" (UID: \"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\") " pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873066 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-cni-binary-copy\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873086 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-proxy-tls\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873101 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-system-cni-dir\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873117 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-slash\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873132 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-systemd-units\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-ovn\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873181 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-var-lib-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873147 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-var-lib-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873190 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873231 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-etc-openvswitch\") pod 
\"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873244 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0b9d04cf-acc2-45e0-8e1c-23c28c061af4-hosts-file\") pod \"node-resolver-5r28v\" (UID: \"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\") " pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873284 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873255 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873316 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-systemd-units\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873322 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-system-cni-dir\") pod 
\"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873321 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-system-cni-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-etc-openvswitch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873360 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-system-cni-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873287 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-slash\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873408 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-cni-bin\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 
crc kubenswrapper[4732]: I1010 06:51:38.873375 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-cni-bin\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873418 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-netns\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873449 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-kubelet\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-log-socket\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873516 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-cni-multus\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873536 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4fnwz\" (UniqueName: \"kubernetes.io/projected/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-kube-api-access-4fnwz\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-kubelet\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873564 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-cnibin\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873565 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-log-socket\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgdch\" (UniqueName: 
\"kubernetes.io/projected/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-kube-api-access-xgdch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873620 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-var-lib-cni-multus\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873629 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fnk4\" (UniqueName: \"kubernetes.io/projected/0b9d04cf-acc2-45e0-8e1c-23c28c061af4-kube-api-access-8fnk4\") pod \"node-resolver-5r28v\" (UID: \"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\") " pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873671 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlqtl\" (UniqueName: \"kubernetes.io/projected/aef461dc-7905-4f5e-a90e-046ffcf8258d-kube-api-access-qlqtl\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873729 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-bin\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873760 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-config\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873791 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46xph\" (UniqueName: \"kubernetes.io/projected/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-kube-api-access-46xph\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873823 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aef461dc-7905-4f5e-a90e-046ffcf8258d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873847 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-rootfs\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-script-lib\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873888 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-node-log\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873910 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-k8s-cni-cncf-io\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873931 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aef461dc-7905-4f5e-a90e-046ffcf8258d-cni-binary-copy\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873953 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-daemon-config\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873977 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-multus-certs\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873993 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-bin\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874009 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-mcd-auth-proxy-config\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874038 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-os-release\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-netns\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874075 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-cnibin\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874131 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovn-node-metrics-cert\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874164 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-hostroot\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874181 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-netns\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874158 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-node-log\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874226 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-os-release\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: 
\"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874324 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-kubelet\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874378 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-systemd\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874410 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-env-overrides\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874440 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-cni-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " 
pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874462 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aef461dc-7905-4f5e-a90e-046ffcf8258d-os-release\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874470 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-cnibin\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874170 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-multus-certs\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874496 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-netd\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.873795 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-cni-binary-copy\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874195 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-host-run-k8s-cni-cncf-io\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874591 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-ovn-kubernetes\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874609 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-socket-dir-parent\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874625 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-systemd\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-conf-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874661 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-etc-kubernetes\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874723 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-config\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-etc-kubernetes\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.874985 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-mcd-auth-proxy-config\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875050 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-cni-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875061 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-socket-dir-parent\") pod \"multus-pnlkp\" (UID: 
\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875086 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-conf-dir\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875127 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-cnibin\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aef461dc-7905-4f5e-a90e-046ffcf8258d-cni-binary-copy\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875156 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-env-overrides\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875163 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-netd\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875179 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-kubelet\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875207 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-rootfs\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875220 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-os-release\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-hostroot\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875320 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-multus-daemon-config\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.875758 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-script-lib\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.876062 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aef461dc-7905-4f5e-a90e-046ffcf8258d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.878555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-proxy-tls\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.878672 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovn-node-metrics-cert\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.888259 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.894194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlqtl\" (UniqueName: \"kubernetes.io/projected/aef461dc-7905-4f5e-a90e-046ffcf8258d-kube-api-access-qlqtl\") pod \"multus-additional-cni-plugins-2fxmr\" (UID: \"aef461dc-7905-4f5e-a90e-046ffcf8258d\") " pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.894288 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.896041 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.896074 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.896086 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.896208 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.899650 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-46xph\" (UniqueName: \"kubernetes.io/projected/1ca39c55-1a82-41b2-b7d5-925320a4e8a0-kube-api-access-46xph\") pod \"machine-config-daemon-292kd\" (UID: \"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\") " pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.903747 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fnwz\" (UniqueName: \"kubernetes.io/projected/d94cc3c3-3cb6-4a5b-996b-90099415f9bf-kube-api-access-4fnwz\") pod \"multus-pnlkp\" (UID: \"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\") " pod="openshift-multus/multus-pnlkp" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.905087 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgdch\" (UniqueName: \"kubernetes.io/projected/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-kube-api-access-xgdch\") pod \"ovnkube-node-kdb2x\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.905680 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fnk4\" (UniqueName: \"kubernetes.io/projected/0b9d04cf-acc2-45e0-8e1c-23c28c061af4-kube-api-access-8fnk4\") pod \"node-resolver-5r28v\" (UID: \"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\") " pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.908904 4732 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.909219 4732 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.910494 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.910552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.910727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.910745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.910765 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.910776 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:38Z","lastTransitionTime":"2025-10-10T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.934736 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: E1010 06:51:38.939949 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.944124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.944163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.944174 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.944196 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.944206 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:38Z","lastTransitionTime":"2025-10-10T06:51:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.966118 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2
dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:38 crc kubenswrapper[4732]: I1010 06:51:38.995816 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: E1010 06:51:39.001623 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:38Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.005210 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.005285 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.005304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.005329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.005351 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.021217 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pnlkp" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.031456 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.031530 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.040495 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.047199 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5r28v" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.054046 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:39 crc kubenswrapper[4732]: E1010 06:51:39.060586 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.068085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.068125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.068136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.068155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.068165 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.068868 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: E1010 06:51:39.085873 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.088415 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.089977 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.089998 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc 
kubenswrapper[4732]: I1010 06:51:39.090007 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.090021 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.090031 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: W1010 06:51:39.093935 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf77a19b4_118c_4b7d_9ef2_b7be7fd33e63.slice/crio-af9c9e0e18853b2aa63161efdfd53cbb41ea8447255e61b04bd6b8c6058b43e1 WatchSource:0}: Error finding container af9c9e0e18853b2aa63161efdfd53cbb41ea8447255e61b04bd6b8c6058b43e1: Status 404 returned error can't find the container with id af9c9e0e18853b2aa63161efdfd53cbb41ea8447255e61b04bd6b8c6058b43e1 Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.104160 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: E1010 06:51:39.106476 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: E1010 06:51:39.106619 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.109210 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.109257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.109271 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.109293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.109308 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.116636 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.139681 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e
8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.154728 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.176466 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.197064 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.210817 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.214869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.214913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.214927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.214948 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.214960 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.224344 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.238119 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.317009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.317049 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.317062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.317080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.317091 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.419267 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.419326 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.419337 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.419356 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.419367 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.521589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.521629 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.521638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.521651 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.521660 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.624955 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.625028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.625048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.625075 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.625091 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.659569 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:39 crc kubenswrapper[4732]: E1010 06:51:39.659743 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.727338 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.727380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.727388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.727403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.727414 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.818026 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5r28v" event={"ID":"0b9d04cf-acc2-45e0-8e1c-23c28c061af4","Type":"ContainerStarted","Data":"bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.818077 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5r28v" event={"ID":"0b9d04cf-acc2-45e0-8e1c-23c28c061af4","Type":"ContainerStarted","Data":"16ecb724add659aaf86a3b92c2515222ea8c20c90b08551d1c3400ad203616b9"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.819568 4732 generic.go:334] "Generic (PLEG): container finished" podID="aef461dc-7905-4f5e-a90e-046ffcf8258d" containerID="3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc" exitCode=0 Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.819643 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerDied","Data":"3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.819675 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerStarted","Data":"13877ace56aad62e369b90dbecab0cfdaf84353d8baa37a009a77eaab04f640c"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.821206 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.821232 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.821245 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"c43bf64990c43643ee14383ca5c2af9b8505fc712d9b46b14d67df2f28a9f5cd"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.822604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerStarted","Data":"8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.822631 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerStarted","Data":"6f32335939b1986b860cfc43d4c434d7e86efadc7b15632e209d666cd968ba51"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.824038 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0" exitCode=0 Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.824085 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.824118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" 
event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"af9c9e0e18853b2aa63161efdfd53cbb41ea8447255e61b04bd6b8c6058b43e1"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.829398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.829440 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.829453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.829471 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.829487 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.834914 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2
dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.851235 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.870409 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.882106 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.898936 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.912199 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.926166 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.933368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.933421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.933433 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.933454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.933470 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:39Z","lastTransitionTime":"2025-10-10T06:51:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.949508 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.962731 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:39 crc kubenswrapper[4732]: I1010 06:51:39.981203 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:39Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.010920 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.027670 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.035646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.035714 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.035727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.035748 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.035763 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.043361 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.056849 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.075298 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.088349 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.100758 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.112787 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.125458 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.138228 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.139714 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.139758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.139767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.139783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.139796 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.148777 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.162407 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.177043 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.183259 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jn2jn"] Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.183727 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.185864 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.186211 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.186410 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.187542 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.191331 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.210093 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.222922 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.236174 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.241940 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.241979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc 
kubenswrapper[4732]: I1010 06:51:40.241988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.242001 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.242051 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.253913 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.265899 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.276416 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.288446 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.288561 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:51:48.288542416 +0000 UTC m=+35.358133657 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.288489 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.288762 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.288792 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-host\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.288812 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hwpn\" (UniqueName: \"kubernetes.io/projected/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-kube-api-access-8hwpn\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.288835 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.288914 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.288920 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-serviceca\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.288933 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.288948 4732 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:48.288940957 +0000 UTC m=+35.358532198 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.289011 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:48.288983089 +0000 UTC m=+35.358574400 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.309775 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.323085 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.338078 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.343928 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.343966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc 
kubenswrapper[4732]: I1010 06:51:40.343975 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.343990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.344001 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.358644 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.373913 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.386739 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.390619 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-host\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.390869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hwpn\" (UniqueName: 
\"kubernetes.io/projected/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-kube-api-access-8hwpn\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.390932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.390966 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.391001 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-serviceca\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.391602 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-host\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.391746 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.391787 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.391804 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.391858 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:48.391842266 +0000 UTC m=+35.461433507 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.392183 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-serviceca\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.392283 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.392309 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.392320 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.392356 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-10 06:51:48.39234041 +0000 UTC m=+35.461931651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.406776 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.420256 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.425798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hwpn\" (UniqueName: \"kubernetes.io/projected/c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737-kube-api-access-8hwpn\") pod \"node-ca-jn2jn\" (UID: \"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\") " pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.430967 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.445486 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.446420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.446461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.446475 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.446503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.446517 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.461464 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c8
72a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.470795 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.549009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.549372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.549386 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 
06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.549405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.549416 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.588684 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jn2jn" Oct 10 06:51:40 crc kubenswrapper[4732]: W1010 06:51:40.603286 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6e2c9ca_34c4_4d36_9ac4_0e1f6b665737.slice/crio-631e4f6c608fa93f0839e5fb3d07686a0a53ff03e924123fc34f2084a1024f3f WatchSource:0}: Error finding container 631e4f6c608fa93f0839e5fb3d07686a0a53ff03e924123fc34f2084a1024f3f: Status 404 returned error can't find the container with id 631e4f6c608fa93f0839e5fb3d07686a0a53ff03e924123fc34f2084a1024f3f Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.651869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.651912 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.651924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.651941 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.651954 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.659194 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.659195 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.659382 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:40 crc kubenswrapper[4732]: E1010 06:51:40.659517 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.755130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.755207 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.755221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.755268 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.755283 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.831583 4732 generic.go:334] "Generic (PLEG): container finished" podID="aef461dc-7905-4f5e-a90e-046ffcf8258d" containerID="738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99" exitCode=0 Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.831642 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerDied","Data":"738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.836035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.836081 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.836093 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.836107 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.836121 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.837938 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jn2jn" event={"ID":"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737","Type":"ContainerStarted","Data":"055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.837988 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jn2jn" event={"ID":"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737","Type":"ContainerStarted","Data":"631e4f6c608fa93f0839e5fb3d07686a0a53ff03e924123fc34f2084a1024f3f"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.851654 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.859023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.859063 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.859073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.859092 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.859102 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.870322 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.888946 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.908325 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.928283 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.942277 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.955091 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.962917 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.962975 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.962990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:40 crc 
kubenswrapper[4732]: I1010 06:51:40.963010 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.963023 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:40Z","lastTransitionTime":"2025-10-10T06:51:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.967151 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.980973 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:40 crc kubenswrapper[4732]: I1010 06:51:40.992548 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.004458 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.018763 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.035403 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.047001 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.063449 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.065303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.065352 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.065362 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.065380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.065392 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.076285 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.094212 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.112072 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.130137 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.142011 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.154214 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.167661 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.167723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.167733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc 
kubenswrapper[4732]: I1010 06:51:41.167750 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.167761 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.181523 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.218913 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.234156 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.247072 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.295964 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.295998 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.296009 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.296024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.296034 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.299952 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c8
72a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.308369 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.318302 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T0
6:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.330048 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.357962 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.397808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.397866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.397879 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.397905 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.397918 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.501202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.501248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.501259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.501276 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.501287 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.604184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.604231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.604243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.604260 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.604272 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.660289 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:41 crc kubenswrapper[4732]: E1010 06:51:41.660505 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.706941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.706990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.707000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.707016 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.707029 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.809102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.809158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.809172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.809189 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.809202 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.844037 4732 generic.go:334] "Generic (PLEG): container finished" podID="aef461dc-7905-4f5e-a90e-046ffcf8258d" containerID="91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a" exitCode=0 Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.844119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerDied","Data":"91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.849870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.863258 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.881872 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.902835 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.915137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.915167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.915177 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.915194 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.915205 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:41Z","lastTransitionTime":"2025-10-10T06:51:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.918471 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.932495 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.946925 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.960259 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.971732 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:41 crc kubenswrapper[4732]: I1010 06:51:41.985008 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:41Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.005341 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.017807 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.017852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.017861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.017879 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.017889 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.019680 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.036137 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.051280 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 
06:51:42.068950 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.088942 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.119891 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.119926 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.119936 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.119950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.119959 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.222614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.222665 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.222677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.222723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.222746 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.325184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.325228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.325240 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.325257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.325270 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.428280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.428337 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.428354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.428445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.428459 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.531669 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.531747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.531761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.531786 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.531804 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.634494 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.634542 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.634555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.634574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.634587 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.659975 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.659986 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:42 crc kubenswrapper[4732]: E1010 06:51:42.660124 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:42 crc kubenswrapper[4732]: E1010 06:51:42.660270 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.737452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.737499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.737515 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.737539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.737556 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.840355 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.840434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.840456 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.840485 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.840506 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.856650 4732 generic.go:334] "Generic (PLEG): container finished" podID="aef461dc-7905-4f5e-a90e-046ffcf8258d" containerID="b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2" exitCode=0 Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.856721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerDied","Data":"b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.875278 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.895390 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.908797 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.920397 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.933357 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.945486 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.947937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.947991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.948002 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.948022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.948035 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:42Z","lastTransitionTime":"2025-10-10T06:51:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.960458 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.971828 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.986602 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:42 crc kubenswrapper[4732]: I1010 06:51:42.998837 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:42Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.013055 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.030857 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.044728 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.052006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.052031 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.052040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.052057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.052066 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.060592 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.115807 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.155467 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.155504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.155514 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.155534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.155544 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.257856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.258128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.258193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.258269 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.258365 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.361801 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.361854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.361864 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.361885 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.361896 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.464153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.464216 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.464229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.464246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.464257 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.566950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.567658 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.567780 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.567883 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.567957 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.659988 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:43 crc kubenswrapper[4732]: E1010 06:51:43.660554 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.671103 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.671184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.671199 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.671232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.671251 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.676812 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z 
is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.692713 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.704987 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.721284 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.739307 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.757805 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.772798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.772846 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.772858 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.772874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.772885 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.787022 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.800124 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.820632 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.838546 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.852632 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.864893 4732 generic.go:334] "Generic (PLEG): container finished" podID="aef461dc-7905-4f5e-a90e-046ffcf8258d" containerID="e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0" exitCode=0 Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.864959 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerDied","Data":"e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.869422 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.871799 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.876067 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.876125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.876138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.876158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.876170 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.888313 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.901338 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.917709 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.932427 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.947034 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.964199 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b3610
9187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.979114 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.979879 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.979925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.979937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.979955 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.979965 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:43Z","lastTransitionTime":"2025-10-10T06:51:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:43 crc kubenswrapper[4732]: I1010 06:51:43.996086 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.019898 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.040980 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.055679 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.069170 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.082839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.082886 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.082898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc 
kubenswrapper[4732]: I1010 06:51:44.082916 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.082928 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.084254 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.099552 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.121430 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.135460 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.145247 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.155586 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.185082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.185138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.185149 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.185168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.185179 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.288375 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.288422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.288434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.288453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.288465 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.391466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.391547 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.391572 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.391598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.391645 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.493976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.494047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.494071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.494100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.494124 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.596576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.596636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.596650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.596682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.596745 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.659754 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.659788 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:44 crc kubenswrapper[4732]: E1010 06:51:44.660208 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:44 crc kubenswrapper[4732]: E1010 06:51:44.660374 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.698861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.698899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.698909 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.698923 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.698933 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.802151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.802245 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.802273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.802300 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.802323 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.875724 4732 generic.go:334] "Generic (PLEG): container finished" podID="aef461dc-7905-4f5e-a90e-046ffcf8258d" containerID="7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6" exitCode=0 Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.875781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerDied","Data":"7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.896751 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.904496 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.904552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.904569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.904590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.904607 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:44Z","lastTransitionTime":"2025-10-10T06:51:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.914211 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.929545 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.949155 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06
:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-c
opy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.964729 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:44 crc kubenswrapper[4732]: I1010 06:51:44.984455 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:44Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.009255 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.009315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.009330 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.009353 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.009366 4732 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.016925 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.032055 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.049552 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.062162 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.074271 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.087207 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.101130 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.111540 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.111583 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.111593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.111609 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.111621 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.115127 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.128674 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.214084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.214136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.214144 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.214160 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.214170 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.316851 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.316887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.316898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.316913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.316922 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.419641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.419706 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.419722 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.419740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.419754 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.521663 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.521715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.521725 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.521739 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.521749 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.624910 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.624984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.625008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.625038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.625061 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.660021 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:45 crc kubenswrapper[4732]: E1010 06:51:45.660203 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.727907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.727965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.727978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.728002 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.728017 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.830929 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.830975 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.830989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.831014 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.831027 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.884028 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.884471 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.889496 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" event={"ID":"aef461dc-7905-4f5e-a90e-046ffcf8258d","Type":"ContainerStarted","Data":"e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.902227 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.913048 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.921663 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.933666 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.933732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.933748 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.933771 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.933788 4732 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:45Z","lastTransitionTime":"2025-10-10T06:51:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.950233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.982110 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:45 crc kubenswrapper[4732]: I1010 06:51:45.999390 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:45Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.021788 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.025832 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.036383 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.036473 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.036498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.036530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 
06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.036553 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.041500 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.058309 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.077815 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.090101 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.107848 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.123530 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.138117 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.139334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc 
kubenswrapper[4732]: I1010 06:51:46.139432 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.139451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.139474 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.139489 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.149145 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.159992 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.171549 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.182474 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.193237 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.205776 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.217088 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.226467 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.237415 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.241943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.241988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.242000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.242017 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.242028 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.249891 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.263886 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.277180 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.289352 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.302893 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.316877 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T
06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.341670 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.344447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.344484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.344494 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.344511 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.344524 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.366074 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.446878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.446965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.446979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.446998 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.447011 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.549606 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.549660 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.549677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.549751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.549780 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.652406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.652455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.652466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.652483 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.652493 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.660060 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.660099 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:46 crc kubenswrapper[4732]: E1010 06:51:46.660263 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:46 crc kubenswrapper[4732]: E1010 06:51:46.660492 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.755445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.755496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.755510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.755529 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.755545 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.858890 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.858967 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.859006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.859039 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.859062 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.892914 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.893308 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.915624 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.928182 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.942299 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.957779 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.961932 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.961978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.961989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:46 crc 
kubenswrapper[4732]: I1010 06:51:46.962005 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.962017 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:46Z","lastTransitionTime":"2025-10-10T06:51:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.973595 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.983914 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:46 crc kubenswrapper[4732]: I1010 06:51:46.996021 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:46Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.007991 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.021534 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.047011 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.060982 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.064080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.064126 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.064138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.064152 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.064160 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.072814 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.093250 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.109251 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.123863 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.134114 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:47Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.166164 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.166222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.166239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc 
kubenswrapper[4732]: I1010 06:51:47.166260 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.166277 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.269016 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.269061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.269070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.269084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.269116 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.371536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.371593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.371603 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.371619 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.371628 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.473704 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.473742 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.473751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.473765 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.473775 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.577509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.577847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.577889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.577920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.577934 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.660315 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:47 crc kubenswrapper[4732]: E1010 06:51:47.660572 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.682210 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.682244 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.682256 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.682272 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.682283 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.784457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.784502 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.784513 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.784529 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.784539 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.886568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.886602 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.886612 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.886627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.886635 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.895635 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.989981 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.990048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.990064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.990092 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:47 crc kubenswrapper[4732]: I1010 06:51:47.990105 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:47Z","lastTransitionTime":"2025-10-10T06:51:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.092154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.092227 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.092263 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.092280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.092292 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.196136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.196187 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.196201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.196220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.196236 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.247498 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.299283 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.299334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.299347 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.299366 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.299376 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.378562 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.378761 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.378862 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.378864 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:52:04.378821556 +0000 UTC m=+51.448412807 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.379016 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.379063 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:04.379003121 +0000 UTC m=+51.448594362 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.379202 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.379266 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:04.379251068 +0000 UTC m=+51.448842329 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.402578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.402624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.402643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.403840 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 
10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.403891 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.479757 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.479818 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.479970 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.479984 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.479998 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.479998 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.480009 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.480018 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.480072 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:04.480053999 +0000 UTC m=+51.549645240 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.480089 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:04.48008419 +0000 UTC m=+51.549675431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.507771 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.507848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.507867 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.507897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.507918 4732 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.611261 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.611321 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.611335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.611360 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.611375 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.659969 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.660083 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.660149 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:48 crc kubenswrapper[4732]: E1010 06:51:48.660319 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.714178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.714239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.714257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.714281 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.714298 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.817322 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.817361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.817371 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.817392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.817407 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.901872 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/0.log" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.905255 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f" exitCode=1 Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.905309 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.906210 4732 scope.go:117] "RemoveContainer" containerID="8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.920114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.920164 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.920176 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.920194 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.920207 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:48Z","lastTransitionTime":"2025-10-10T06:51:48Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.931821 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.947958 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.963434 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.983996 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:48 crc kubenswrapper[4732]: I1010 06:51:48.998735 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:48Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.011730 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.023088 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.023305 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.023348 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.023360 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.023381 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.023395 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.040961 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.057739 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.074765 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.094523 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.110742 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.126199 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.126226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.126236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.126252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.126261 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.130864 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.157814 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:48Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113205 6035 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1010 06:51:48.113249 6035 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.113301 6035 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113490 6035 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113952 6035 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.114120 6035 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.114290 6035 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.189613 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.225683 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.225735 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.225744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.225762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.225775 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.237801 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.241372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.241404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.241415 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.241432 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.241444 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.257087 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.261345 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.261388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.261398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.261415 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.261425 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.275113 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.279830 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.279882 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.279894 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.279937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.279949 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.297131 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.301773 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.301846 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.301861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.301879 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.301890 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.317199 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.317320 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.319167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.319233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.319244 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.319264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.319274 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.422055 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.422090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.422100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.422113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.422122 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.524139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.524191 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.524206 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.524222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.524232 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.628313 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.628359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.628368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.628385 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.628395 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.659663 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.659821 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.731498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.731537 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.731549 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.731566 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.731586 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.836009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.836062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.836079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.836102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.836119 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.910165 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/1.log" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.910619 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/0.log" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.913255 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e" exitCode=1 Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.913307 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.913381 4732 scope.go:117] "RemoveContainer" containerID="8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.914171 4732 scope.go:117] "RemoveContainer" containerID="3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e" Oct 10 06:51:49 crc kubenswrapper[4732]: E1010 06:51:49.914929 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.930894 4732 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.938275 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.938317 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.938329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.938344 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.938355 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:49Z","lastTransitionTime":"2025-10-10T06:51:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.949459 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.961082 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.971549 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.981333 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:49 crc kubenswrapper[4732]: I1010 06:51:49.996237 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.011031 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.029975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.040499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.040536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.040546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.040564 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.040575 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.052309 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.072058 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.091754 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.120717 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:48Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113205 6035 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1010 06:51:48.113249 6035 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.113301 6035 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113490 6035 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113952 6035 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.114120 6035 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.114290 6035 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.140447 4732 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778be
e0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.142484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.142530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.142540 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.142552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.142560 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.159873 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.176189 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.245155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.245190 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.245199 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc 
kubenswrapper[4732]: I1010 06:51:50.245212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.245221 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.348438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.348487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.348496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.348512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.348524 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.451351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.451400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.451411 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.451430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.451441 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.556496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.557185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.557214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.557241 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.557260 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.606443 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w"] Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.606938 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.610002 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.610121 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.624830 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66
438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.641633 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.659214 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.659267 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:50 crc kubenswrapper[4732]: E1010 06:51:50.659369 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:50 crc kubenswrapper[4732]: E1010 06:51:50.659464 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.661153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.661184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.661197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.661214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.661227 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.663024 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.681748 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.702288 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f8094ef-2a6d-4a6c-add7-628eff37abe3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") 
" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.702373 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxrsh\" (UniqueName: \"kubernetes.io/projected/3f8094ef-2a6d-4a6c-add7-628eff37abe3-kube-api-access-xxrsh\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.702473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f8094ef-2a6d-4a6c-add7-628eff37abe3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.702720 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f8094ef-2a6d-4a6c-add7-628eff37abe3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.707027 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.720673 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.735600 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.752263 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.763944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.764040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.764089 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.764121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.764152 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.769317 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.785743 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.800490 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.805998 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f8094ef-2a6d-4a6c-add7-628eff37abe3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.806095 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxrsh\" (UniqueName: \"kubernetes.io/projected/3f8094ef-2a6d-4a6c-add7-628eff37abe3-kube-api-access-xxrsh\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc 
kubenswrapper[4732]: I1010 06:51:50.806164 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f8094ef-2a6d-4a6c-add7-628eff37abe3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.807066 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f8094ef-2a6d-4a6c-add7-628eff37abe3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.807132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f8094ef-2a6d-4a6c-add7-628eff37abe3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.809989 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f8094ef-2a6d-4a6c-add7-628eff37abe3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.817447 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f8094ef-2a6d-4a6c-add7-628eff37abe3-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.827951 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxrsh\" (UniqueName: \"kubernetes.io/projected/3f8094ef-2a6d-4a6c-add7-628eff37abe3-kube-api-access-xxrsh\") pod \"ovnkube-control-plane-749d76644c-tvj9w\" (UID: \"3f8094ef-2a6d-4a6c-add7-628eff37abe3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.828587 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.846832 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.860334 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.867306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.867348 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.867359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.867377 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.867390 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.882969 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ad2ec139b4886e9226b028c5d68e1afb19b4c4e728ad03a90ccbd8c5c6d3c3f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:48Z\\\",\\\"message\\\":\\\"from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113205 6035 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1010 06:51:48.113249 6035 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.113301 6035 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113490 6035 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1010 06:51:48.113952 6035 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.114120 6035 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1010 06:51:48.114290 6035 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy 
controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.906400 4732 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed87
5143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb
8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.917992 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/1.log" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.922407 4732 scope.go:117] "RemoveContainer" containerID="3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e" Oct 10 06:51:50 crc kubenswrapper[4732]: E1010 06:51:50.922601 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.927307 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.935570 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b361
09187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.951413 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.970118 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.973468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.973505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.973514 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.973528 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.973538 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:50Z","lastTransitionTime":"2025-10-10T06:51:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:50 crc kubenswrapper[4732]: I1010 06:51:50.990605 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:50Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.009106 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.026233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.048722 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.064453 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.075983 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.076045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.076064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.076092 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.076111 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.084520 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.101147 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.117951 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.132032 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.149998 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.168081 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.180260 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.180263 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.180327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.180576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.180616 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.180632 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.195286 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.284191 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.284429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.284589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.284766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.284945 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.388489 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.388550 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.388573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.388601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.388628 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.491624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.491673 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.491686 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.491735 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.491752 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.595544 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.595638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.595663 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.595740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.595817 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.659587 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:51 crc kubenswrapper[4732]: E1010 06:51:51.660243 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.699165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.699238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.699265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.699298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.699322 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.801353 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.801419 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.801435 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.801461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.801479 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.905907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.905995 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.906015 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.906043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.906063 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:51Z","lastTransitionTime":"2025-10-10T06:51:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.928221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" event={"ID":"3f8094ef-2a6d-4a6c-add7-628eff37abe3","Type":"ContainerStarted","Data":"6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.928281 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" event={"ID":"3f8094ef-2a6d-4a6c-add7-628eff37abe3","Type":"ContainerStarted","Data":"5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.928295 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" event={"ID":"3f8094ef-2a6d-4a6c-add7-628eff37abe3","Type":"ContainerStarted","Data":"1ff1f5360f4a52880b245211bdae7777b8ba88cef6772d936b43cb3c29a04966"} Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.946822 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.959169 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.973508 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:51 crc kubenswrapper[4732]: I1010 06:51:51.996843 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.009031 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.009082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.009093 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.009114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.009128 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.019773 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.040754 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:
51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.067856 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.084679 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.101182 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.111588 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.111797 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.111885 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 
06:51:52.111962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.112037 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.115294 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.127616 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.142577 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID
\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller
-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.155079 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.170464 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.184820 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.195113 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.214860 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.214962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.215018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.215087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.215155 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.317658 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.317775 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.317801 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.317833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.317861 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.421023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.421098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.421125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.421156 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.421179 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.471884 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-mj7bk"] Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.472650 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:52 crc kubenswrapper[4732]: E1010 06:51:52.472883 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.493162 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.513900 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.523565 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.523591 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc 
kubenswrapper[4732]: I1010 06:51:52.523605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.523623 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.523636 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.526075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spk7r\" (UniqueName: \"kubernetes.io/projected/77abff23-1622-4219-a841-49fe8dbb6cc3-kube-api-access-spk7r\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.526108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.534226 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.552166 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.565994 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.577895 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.589327 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.600006 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.611233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.627875 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spk7r\" (UniqueName: \"kubernetes.io/projected/77abff23-1622-4219-a841-49fe8dbb6cc3-kube-api-access-spk7r\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.628120 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" 
Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.628214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: E1010 06:51:52.628216 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.628252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.628265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.628283 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: E1010 06:51:52.628310 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:53.128281948 +0000 UTC m=+40.197873199 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.628299 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.630533 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.643430 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.650123 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spk7r\" (UniqueName: \"kubernetes.io/projected/77abff23-1622-4219-a841-49fe8dbb6cc3-kube-api-access-spk7r\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.657416 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb914
4e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\
\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.659681 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:52 crc kubenswrapper[4732]: E1010 06:51:52.659802 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.660019 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:52 crc kubenswrapper[4732]: E1010 06:51:52.660095 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.669344 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc 
kubenswrapper[4732]: I1010 06:51:52.688726 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.704650 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.722333 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.730752 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.730785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.730794 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.730806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.730816 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.742927 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:52Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.833846 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.833893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.833903 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.833940 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.833954 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.937005 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.937087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.937107 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.937141 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:52 crc kubenswrapper[4732]: I1010 06:51:52.937162 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:52Z","lastTransitionTime":"2025-10-10T06:51:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.041054 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.041112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.041128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.041152 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.041166 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.131944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:53 crc kubenswrapper[4732]: E1010 06:51:53.132148 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:51:53 crc kubenswrapper[4732]: E1010 06:51:53.132223 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:54.13220266 +0000 UTC m=+41.201793901 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.143977 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.144079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.144102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.144139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.144167 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.246979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.247058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.247084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.247116 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.247146 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.350662 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.350766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.350780 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.350801 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.350815 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.454646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.454727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.454743 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.454767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.454786 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.557677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.557767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.557783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.557805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.557823 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.659248 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:53 crc kubenswrapper[4732]: E1010 06:51:53.659520 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.660389 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.660444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.660462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.660492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.660523 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.676390 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.691897 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.709572 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.721343 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc 
kubenswrapper[4732]: I1010 06:51:53.740616 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.757360 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.763401 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.763452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.763471 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.763496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.763514 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.782458 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.815773 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.831511 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.848003 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.864257 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.865758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.865783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.865791 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc 
kubenswrapper[4732]: I1010 06:51:53.865807 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.865816 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.878022 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10
T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.893662 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.913444 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.931522 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.946876 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.959680 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.968086 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.968123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.968132 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.968145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:53 crc kubenswrapper[4732]: I1010 06:51:53.968156 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:53Z","lastTransitionTime":"2025-10-10T06:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.071099 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.071445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.071568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.071730 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.071864 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.143448 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk"
Oct 10 06:51:54 crc kubenswrapper[4732]: E1010 06:51:54.143847 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 10 06:51:54 crc kubenswrapper[4732]: E1010 06:51:54.143955 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:51:56.14393312 +0000 UTC m=+43.213524441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.175301 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.175335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.175345 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.175359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.175368 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.277731 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.278028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.278113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.278198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.278274 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.381344 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.381391 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.381403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.381421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.381434 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.484267 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.484306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.484315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.484331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.484341 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.589816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.589848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.589861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.589876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.589886 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.660160 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.660167 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk"
Oct 10 06:51:54 crc kubenswrapper[4732]: E1010 06:51:54.660326 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 10 06:51:54 crc kubenswrapper[4732]: E1010 06:51:54.660376 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.660669 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:51:54 crc kubenswrapper[4732]: E1010 06:51:54.661558 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.692438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.692490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.692501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.692522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.692538 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.795354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.795452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.795471 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.795496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.795514 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.898594 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.898869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.898915 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.898943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:54 crc kubenswrapper[4732]: I1010 06:51:54.898962 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:54Z","lastTransitionTime":"2025-10-10T06:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.001747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.001805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.001823 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.001844 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.001861 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.106167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.106226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.106236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.106255 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.106268 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.210105 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.210161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.210172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.210193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.210210 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.313062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.313111 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.313124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.313143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.313156 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.415523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.415573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.415589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.415605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.415616 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.519484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.519544 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.519562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.519581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.519599 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.623078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.623142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.623160 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.623184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.623203 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.659421 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:55 crc kubenswrapper[4732]: E1010 06:51:55.659543 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.725431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.725503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.725514 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.725530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.725540 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.827628 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.827671 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.827681 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.827718 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.827727 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.930819 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.930875 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.930895 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.930920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:55 crc kubenswrapper[4732]: I1010 06:51:55.930937 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:55Z","lastTransitionTime":"2025-10-10T06:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.033939 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.034042 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.034060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.034084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.034100 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.137119 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.137179 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.137197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.137221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.137239 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.167028 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:56 crc kubenswrapper[4732]: E1010 06:51:56.167253 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:51:56 crc kubenswrapper[4732]: E1010 06:51:56.167349 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:00.167326091 +0000 UTC m=+47.236917372 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.240422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.240497 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.240521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.240551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.240573 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.344120 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.344178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.344197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.344220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.344237 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.447680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.447776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.447796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.447824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.447841 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.550799 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.551077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.551091 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.551106 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.551117 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.653889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.653947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.653964 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.653986 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.654026 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.659564 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.659759 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.659776 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:56 crc kubenswrapper[4732]: E1010 06:51:56.660272 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:56 crc kubenswrapper[4732]: E1010 06:51:56.661276 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:51:56 crc kubenswrapper[4732]: E1010 06:51:56.661536 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.756869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.756920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.756937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.756963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.756980 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.859434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.859774 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.859953 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.860153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.860371 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.963889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.963961 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.963985 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.964012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:56 crc kubenswrapper[4732]: I1010 06:51:56.964035 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:56Z","lastTransitionTime":"2025-10-10T06:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.067031 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.067107 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.067124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.067153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.067172 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.170323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.170403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.170427 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.170459 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.170478 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.273348 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.273427 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.273457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.273493 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.273517 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.376933 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.377019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.377040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.377068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.377086 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.480877 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.480944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.480970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.480996 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.481012 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.584017 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.584067 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.584084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.584108 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.584126 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.660087 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:57 crc kubenswrapper[4732]: E1010 06:51:57.660266 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.686976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.687051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.687076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.687110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.687133 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.790140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.790192 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.790215 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.790245 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.790264 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.894019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.894071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.894094 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.894123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.894144 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.997001 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.997064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.997088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.997121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:57 crc kubenswrapper[4732]: I1010 06:51:57.997143 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:57Z","lastTransitionTime":"2025-10-10T06:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.100475 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.100538 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.100560 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.100588 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.100610 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.203356 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.203492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.203522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.203558 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.203582 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.306576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.306645 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.306668 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.306748 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.306774 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.410080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.410136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.410144 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.410158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.410166 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.513069 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.513136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.513160 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.513187 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.513209 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.616612 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.616664 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.616681 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.616734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.616753 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.660027 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.660081 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:51:58 crc kubenswrapper[4732]: E1010 06:51:58.660531 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.660119 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:51:58 crc kubenswrapper[4732]: E1010 06:51:58.660670 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:51:58 crc kubenswrapper[4732]: E1010 06:51:58.661031 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.719779 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.719817 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.719828 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.719845 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.719855 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.825887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.825923 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.825931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.825944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.825952 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.928799 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.928873 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.928883 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.928899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:58 crc kubenswrapper[4732]: I1010 06:51:58.928938 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:58Z","lastTransitionTime":"2025-10-10T06:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.031469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.031558 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.031570 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.031611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.031622 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.133831 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.133891 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.133908 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.133927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.133939 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.236372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.236422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.236431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.236446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.236458 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.340384 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.340454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.340467 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.340491 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.340506 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.443058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.443091 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.443106 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.443123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.443134 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.453500 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.453574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.453597 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.453626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.453648 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: E1010 06:51:59.475834 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:59Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.481076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.481149 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.481161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.481185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.481198 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: E1010 06:51:59.495076 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:59Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.500252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.500297 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.500309 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.500328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.500339 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: E1010 06:51:59.512681 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:59Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.516389 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.516420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.516428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.516444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.516455 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: E1010 06:51:59.533896 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:59Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.538439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.538474 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.538485 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.538503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.538515 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: E1010 06:51:59.551750 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:59Z is after 2025-08-24T17:21:41Z" Oct 10 06:51:59 crc kubenswrapper[4732]: E1010 06:51:59.551894 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.553705 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.553743 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.553754 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.553769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.553778 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.656925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.656967 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.656978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.656998 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.657009 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.659209 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:51:59 crc kubenswrapper[4732]: E1010 06:51:59.659296 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.759817 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.759855 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.759866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.759882 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.759894 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.862640 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.862679 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.862707 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.862731 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.862744 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.964957 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.965000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.965012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.965027 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:51:59 crc kubenswrapper[4732]: I1010 06:51:59.965062 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:51:59Z","lastTransitionTime":"2025-10-10T06:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.068501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.069256 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.069314 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.069335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.069349 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.172405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.172462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.172475 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.172491 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.172508 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.212272 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:00 crc kubenswrapper[4732]: E1010 06:52:00.212471 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:00 crc kubenswrapper[4732]: E1010 06:52:00.212567 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:08.212545506 +0000 UTC m=+55.282136747 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.275985 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.276077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.276096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.276124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.276144 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.379090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.379134 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.379143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.379156 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.379167 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.483117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.483189 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.483209 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.483235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.483252 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.586827 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.586889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.586899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.586919 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.586929 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.660320 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.660437 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:00 crc kubenswrapper[4732]: E1010 06:52:00.660519 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.660465 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:00 crc kubenswrapper[4732]: E1010 06:52:00.660668 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:00 crc kubenswrapper[4732]: E1010 06:52:00.660824 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.690315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.690424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.690452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.690490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.690516 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.793866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.793951 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.793987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.794019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.794040 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.898090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.898137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.898613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.898639 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:00 crc kubenswrapper[4732]: I1010 06:52:00.898650 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:00Z","lastTransitionTime":"2025-10-10T06:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.000745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.000798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.000811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.000828 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.000841 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.103869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.103912 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.103920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.103937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.103948 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.207013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.207072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.207086 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.207105 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.207116 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.310250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.310300 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.310316 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.310334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.310348 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.413633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.413741 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.413762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.413794 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.413816 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.517208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.517294 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.517318 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.517352 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.517376 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.620198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.620246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.620263 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.620286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.620303 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.659655 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:01 crc kubenswrapper[4732]: E1010 06:52:01.659852 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.723444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.723524 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.723545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.723575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.723595 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.827173 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.827277 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.827299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.827328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.827348 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.930742 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.930812 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.930837 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.930866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:01 crc kubenswrapper[4732]: I1010 06:52:01.930892 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:01Z","lastTransitionTime":"2025-10-10T06:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.034392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.034463 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.034487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.034516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.034538 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.137232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.137280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.137299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.137320 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.137404 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.240008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.240075 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.240099 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.240131 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.240154 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.343089 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.343146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.343165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.343187 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.343203 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.446171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.446213 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.446225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.446242 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.446255 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.549340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.549400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.549412 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.549430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.549442 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.652046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.652116 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.652142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.652166 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.652182 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.659430 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.659479 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:02 crc kubenswrapper[4732]: E1010 06:52:02.659571 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:02 crc kubenswrapper[4732]: E1010 06:52:02.659749 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.659436 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:02 crc kubenswrapper[4732]: E1010 06:52:02.659903 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.755678 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.755989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.756073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.756157 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.756441 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.859484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.859539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.859556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.859578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.859597 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.963782 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.963872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.963897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.963929 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:02 crc kubenswrapper[4732]: I1010 06:52:02.963951 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:02Z","lastTransitionTime":"2025-10-10T06:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.067043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.067164 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.067188 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.067214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.067236 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.170861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.171297 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.171507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.171672 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.171963 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.275545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.275599 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.275617 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.275640 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.275658 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.379058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.379121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.379133 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.379158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.379171 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.482194 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.482331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.482345 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.482361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.482373 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.586029 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.586084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.586102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.586124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.586140 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.659982 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:03 crc kubenswrapper[4732]: E1010 06:52:03.660215 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.661503 4732 scope.go:117] "RemoveContainer" containerID="3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.688926 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manif
ests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1a
a20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"stat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.689675 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.689761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.689780 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.689805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.689822 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.708835 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.727071 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.759467 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.777997 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.794420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.794461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.794638 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.794870 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.794897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc 
kubenswrapper[4732]: I1010 06:52:03.794910 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.811073 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.824414 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc 
kubenswrapper[4732]: I1010 06:52:03.841475 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.855301 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.868660 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.878233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.889665 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.896868 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.896895 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.896902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.896918 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.896927 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.906838 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.923572 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.937057 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.949402 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc 
kubenswrapper[4732]: I1010 06:52:03.977431 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/1.log" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.980244 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb"} Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.980727 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.994547 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:03Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.999386 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.999428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.999443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.999466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:03 crc kubenswrapper[4732]: I1010 06:52:03.999485 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:03Z","lastTransitionTime":"2025-10-10T06:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.010161 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.030366 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} 
options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\
\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.056996 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.072219 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.087244 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.101417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.101463 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.101474 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.101490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.101502 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.110390 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.127896 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.147834 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.166134 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.178788 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.191847 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.204237 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.204284 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.204298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.204315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.204328 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.205864 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.221611 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.234301 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc 
kubenswrapper[4732]: I1010 06:52:04.250600 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.265351 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.306611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.306657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.306665 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.306681 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.306703 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.413927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.413986 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.414004 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.414028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.414045 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.460246 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.460386 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.460465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.460591 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.460594 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.460669 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:36.460647688 +0000 UTC m=+83.530238969 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.460728 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:36.460682079 +0000 UTC m=+83.530273360 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.461123 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:52:36.461099051 +0000 UTC m=+83.530690332 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.517642 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.517728 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.517746 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.517770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.517787 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.561618 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.561729 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.561902 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.561902 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.561955 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.561973 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 
06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.562051 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:36.562028195 +0000 UTC m=+83.631619456 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.561928 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.562094 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.562160 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:36.562138648 +0000 UTC m=+83.631729899 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.620142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.620187 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.620199 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.620214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.620226 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.660009 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.660131 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.660445 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.660507 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.660556 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.660598 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.723159 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.723206 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.723221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.723242 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.723258 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.826528 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.826579 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.826593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.826614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.826629 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.929565 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.929605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.929615 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.929627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.929636 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:04Z","lastTransitionTime":"2025-10-10T06:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.985667 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/2.log" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.986401 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/1.log" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.988676 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb" exitCode=1 Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.988724 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb"} Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.988775 4732 scope.go:117] "RemoveContainer" containerID="3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e" Oct 10 06:52:04 crc kubenswrapper[4732]: I1010 06:52:04.989435 4732 scope.go:117] "RemoveContainer" containerID="a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb" Oct 10 06:52:04 crc kubenswrapper[4732]: E1010 06:52:04.989617 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.000917 4732 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:04Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.013779 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2da
ed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.032957 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.032996 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.033007 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.033022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.033031 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.031873 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b57d7fac725fdf6960e15b4c9a84c210d65c4c0ee7d69ae96e89ca73d91b04e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:51:49Z\\\",\\\"message\\\":\\\"snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nF1010 06:51:49.762263 6162 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:51:49Z is after 2025-08-24T17:21:41Z]\\\\nI1010 06:51:49.762724 6162 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true 
skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node 
crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.053539 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.066210 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.079141 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.090750 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.102308 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.116109 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.128731 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.136009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.136067 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.136079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc 
kubenswrapper[4732]: I1010 06:52:05.136100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.136112 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.141072 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.151215 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.165533 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.180354 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.193648 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.205786 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc 
kubenswrapper[4732]: I1010 06:52:05.217534 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:05Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.238938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.239013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.239040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.239071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 
06:52:05.239088 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.342242 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.342296 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.342310 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.342329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.342341 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.444994 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.445028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.445036 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.445049 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.445058 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.548417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.548499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.548523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.548564 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.548586 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.651873 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.651937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.651962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.651992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.652013 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.659192 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:05 crc kubenswrapper[4732]: E1010 06:52:05.659372 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.759316 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.759375 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.759394 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.759428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.759446 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.863935 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.864000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.864021 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.864048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.864068 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.967469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.967527 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.967542 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.967562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.967576 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:05Z","lastTransitionTime":"2025-10-10T06:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.993274 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/2.log" Oct 10 06:52:05 crc kubenswrapper[4732]: I1010 06:52:05.997305 4732 scope.go:117] "RemoveContainer" containerID="a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb" Oct 10 06:52:05 crc kubenswrapper[4732]: E1010 06:52:05.997450 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.016424 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.030600 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.044352 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.055064 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc 
kubenswrapper[4732]: I1010 06:52:06.070197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.070238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.070250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.070280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.070294 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.079661 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.094437 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.109773 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.127259 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.140836 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.157038 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.169335 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.173210 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.173344 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.173418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc 
kubenswrapper[4732]: I1010 06:52:06.173480 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.173547 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.181049 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10
T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.196187 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.209950 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.221805 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.232747 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.243428 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:06Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.276204 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.276244 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.276255 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.276273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.276286 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.379977 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.380018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.380027 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.380047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.380059 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.482963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.483393 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.483546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.483744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.483953 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.588181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.588510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.588643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.588833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.589044 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.660068 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.660068 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:06 crc kubenswrapper[4732]: E1010 06:52:06.660293 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:06 crc kubenswrapper[4732]: E1010 06:52:06.660363 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.660100 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:06 crc kubenswrapper[4732]: E1010 06:52:06.660486 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.692658 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.692770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.692796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.692818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.692835 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.795891 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.795957 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.795980 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.796008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.796029 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.899265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.899638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.899866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.900088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:06 crc kubenswrapper[4732]: I1010 06:52:06.900294 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:06Z","lastTransitionTime":"2025-10-10T06:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.005818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.005855 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.005865 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.005880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.005891 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.111615 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.111670 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.111684 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.111729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.111748 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.214903 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.214954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.214965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.214982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.214995 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.317992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.318045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.318066 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.318095 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.318117 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.419989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.420853 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.420878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.420894 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.420904 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.523099 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.523127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.523137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.523149 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.523157 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.625549 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.625575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.625583 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.625598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.625609 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.660226 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:07 crc kubenswrapper[4732]: E1010 06:52:07.660366 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.728183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.728466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.728545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.728665 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.728859 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.830890 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.830927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.830939 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.830957 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.830971 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.934332 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.934775 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.935047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.935237 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:07 crc kubenswrapper[4732]: I1010 06:52:07.935403 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:07Z","lastTransitionTime":"2025-10-10T06:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.037571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.037621 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.037638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.037660 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.037677 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.140875 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.140987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.141015 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.141047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.141071 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.244588 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.244658 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.244684 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.244772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.244816 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.302055 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:08 crc kubenswrapper[4732]: E1010 06:52:08.302276 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:08 crc kubenswrapper[4732]: E1010 06:52:08.302349 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:24.30232117 +0000 UTC m=+71.371912441 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.347844 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.348212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.348386 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.348535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.348739 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.452574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.452679 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.452744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.452778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.452801 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.555581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.555632 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.555648 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.555672 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.555909 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.658128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.658164 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.658172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.658185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.658194 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.659484 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.659600 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.659841 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:08 crc kubenswrapper[4732]: E1010 06:52:08.659994 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:08 crc kubenswrapper[4732]: E1010 06:52:08.660288 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:08 crc kubenswrapper[4732]: E1010 06:52:08.660579 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.761834 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.762205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.762456 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.762779 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.763045 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.866487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.866557 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.866571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.866591 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.866604 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.969535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.969574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.969587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.969603 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:08 crc kubenswrapper[4732]: I1010 06:52:08.969614 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:08Z","lastTransitionTime":"2025-10-10T06:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.021324 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.033282 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.034502 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.050326 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\
\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.063330 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc 
kubenswrapper[4732]: I1010 06:52:09.071812 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.071843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.071852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.071868 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.071877 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.076855 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.089008 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.113325 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.130801 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.149971 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.163131 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.176177 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.176219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.176231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 
06:52:09.176245 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.176254 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.177119 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.189819 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.203899 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 
06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3
5825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.219911 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.233030 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.244545 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.258140 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.268404 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.279394 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.279438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.279447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.279462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.279471 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.382002 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.382275 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.382360 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.382447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.382530 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.484844 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.484881 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.484893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.484907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.484918 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.587198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.587455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.587539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.587623 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.587723 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.660197 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:09 crc kubenswrapper[4732]: E1010 06:52:09.660311 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.690205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.690429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.690503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.690587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.690658 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.792504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.792531 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.792539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.792552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.792560 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.894969 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.895011 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.895022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.895039 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.895050 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.924638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.924716 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.924727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.924741 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.924750 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: E1010 06:52:09.936960 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.939609 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.939644 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.939655 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.939670 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.939682 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: E1010 06:52:09.951163 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.954942 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.954976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.954987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.955001 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.955010 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: E1010 06:52:09.967021 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.970654 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.970737 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.970751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.970771 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.970785 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: E1010 06:52:09.980962 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.984416 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.984448 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.984457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.984470 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.984480 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:09 crc kubenswrapper[4732]: E1010 06:52:09.994745 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:09Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:09 crc kubenswrapper[4732]: E1010 06:52:09.994884 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.997093 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.997122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.997131 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.997145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:09 crc kubenswrapper[4732]: I1010 06:52:09.997153 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:09Z","lastTransitionTime":"2025-10-10T06:52:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.099407 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.099452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.099463 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.099480 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.099492 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.201931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.201988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.202014 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.202043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.202066 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.304110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.304160 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.304170 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.304183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.304192 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.407315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.407363 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.407373 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.407388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.407399 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.510399 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.510535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.510561 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.510590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.510611 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.614601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.614656 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.614667 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.614684 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.614719 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.659564 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.659647 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.659822 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:10 crc kubenswrapper[4732]: E1010 06:52:10.660082 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:10 crc kubenswrapper[4732]: E1010 06:52:10.660251 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:10 crc kubenswrapper[4732]: E1010 06:52:10.660365 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.718257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.718324 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.718342 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.718369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.718386 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.820572 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.820624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.820641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.820663 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.820679 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.923147 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.923194 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.923205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.923224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:10 crc kubenswrapper[4732]: I1010 06:52:10.923238 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:10Z","lastTransitionTime":"2025-10-10T06:52:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.025299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.025561 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.025626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.025733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.025851 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.128080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.128118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.128127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.128140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.128149 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.230930 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.231241 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.231636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.231934 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.232188 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.335436 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.335756 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.335866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.335953 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.336031 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.438654 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.438749 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.438774 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.438805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.438827 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.541101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.541142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.541154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.541168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.541176 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.642951 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.643006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.643025 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.643046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.643062 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.659236 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:11 crc kubenswrapper[4732]: E1010 06:52:11.659449 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.745519 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.745580 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.745600 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.745623 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.745640 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.848084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.848129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.848140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.848157 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.848168 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.950742 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.950781 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.950792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.950808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:11 crc kubenswrapper[4732]: I1010 06:52:11.950820 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:11Z","lastTransitionTime":"2025-10-10T06:52:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.054326 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.054379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.054401 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.054429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.054450 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.157506 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.157548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.157556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.157571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.157583 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.260958 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.261021 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.261038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.261066 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.261084 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.363719 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.363762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.363777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.363793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.363801 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.466739 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.466928 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.466981 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.467008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.467031 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.575440 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.575502 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.575521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.575547 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.575565 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.660095 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:12 crc kubenswrapper[4732]: E1010 06:52:12.660250 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.660536 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:12 crc kubenswrapper[4732]: E1010 06:52:12.660896 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.660913 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:12 crc kubenswrapper[4732]: E1010 06:52:12.661361 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.677485 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.677515 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.677522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.677536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.677544 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.781213 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.781266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.781286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.781311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.781330 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.883387 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.883414 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.883421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.883434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.883442 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.985359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.985408 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.985420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.985443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:12 crc kubenswrapper[4732]: I1010 06:52:12.985454 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:12Z","lastTransitionTime":"2025-10-10T06:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.088282 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.088359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.088376 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.088393 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.088405 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.190667 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.190725 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.190736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.190753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.190763 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.293250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.293284 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.293292 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.293308 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.293319 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.395469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.395499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.395508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.395521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.395530 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.499388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.499473 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.499493 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.499516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.499534 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.602578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.603252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.603274 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.603293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.603308 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.659808 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:13 crc kubenswrapper[4732]: E1010 06:52:13.659969 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.679683 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.698554 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a
44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.712766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.712819 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.712836 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.712858 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.712872 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.719376 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 
maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.737235 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.754782 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.767206 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.779078 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.792576 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.804376 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.815800 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.815843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.815856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.815875 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.815890 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.817421 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.826872 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc 
kubenswrapper[4732]: I1010 06:52:13.840540 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.850514 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.863093 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.875686 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T
06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.891800 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.908066 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.917762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.917793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.917806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.917820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.917894 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:13Z","lastTransitionTime":"2025-10-10T06:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:13 crc kubenswrapper[4732]: I1010 06:52:13.919135 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831d
cee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:13Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.020003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.020056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.020071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.020090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.020102 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.122508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.122568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.122578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.122590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.122600 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.225645 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.225680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.225711 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.225726 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.225738 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.329440 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.329518 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.329542 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.329569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.329588 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.432464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.432578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.432643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.432671 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.432757 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.535732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.535814 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.535840 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.535873 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.535897 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.639295 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.639382 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.639400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.639417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.639428 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.659954 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.659983 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.659899 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:14 crc kubenswrapper[4732]: E1010 06:52:14.660074 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:14 crc kubenswrapper[4732]: E1010 06:52:14.660199 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:14 crc kubenswrapper[4732]: E1010 06:52:14.660326 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.742384 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.742443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.742464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.742495 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.742516 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.844747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.844817 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.844836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.844860 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.844891 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.948594 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.948709 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.948729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.948749 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:14 crc kubenswrapper[4732]: I1010 06:52:14.948763 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:14Z","lastTransitionTime":"2025-10-10T06:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.051151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.051203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.051216 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.051236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.051251 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.153319 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.153378 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.153395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.153417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.153435 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.256055 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.256122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.256143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.256171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.256192 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.359112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.359195 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.359219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.359249 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.359272 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.462988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.463037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.463056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.463079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.463096 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.565585 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.565630 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.565643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.565664 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.565676 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.660093 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:15 crc kubenswrapper[4732]: E1010 06:52:15.660220 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.668331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.668397 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.668417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.668447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.668467 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.770893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.770943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.770955 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.770970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.770982 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.876764 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.876817 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.876833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.876854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.876870 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.980266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.980357 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.980371 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.980398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:15 crc kubenswrapper[4732]: I1010 06:52:15.980410 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:15Z","lastTransitionTime":"2025-10-10T06:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.083026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.083070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.083081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.083098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.083113 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.185340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.185392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.185402 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.185421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.185437 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.287586 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.287646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.287657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.287681 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.287720 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.390243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.390307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.390329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.390358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.390379 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.492543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.492596 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.492613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.492636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.492652 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.594893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.594939 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.594950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.594965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.594978 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.660066 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.660080 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:16 crc kubenswrapper[4732]: E1010 06:52:16.660276 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.660080 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:16 crc kubenswrapper[4732]: E1010 06:52:16.660465 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:16 crc kubenswrapper[4732]: E1010 06:52:16.660488 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.697779 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.697834 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.697850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.697872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.697891 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.800668 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.800724 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.800733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.800745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.800754 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.903477 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.903556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.903572 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.903610 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:16 crc kubenswrapper[4732]: I1010 06:52:16.903630 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:16Z","lastTransitionTime":"2025-10-10T06:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.006296 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.006331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.006341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.006358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.006369 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.108639 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.108682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.108736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.108754 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.108766 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.211019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.211064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.211076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.211090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.211100 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.312948 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.313058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.313076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.313097 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.313113 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.416114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.416166 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.416182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.416205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.416221 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.519460 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.519564 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.519589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.519620 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.519644 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.622400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.622437 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.622447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.622460 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.622470 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.660132 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:17 crc kubenswrapper[4732]: E1010 06:52:17.660292 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.726212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.726273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.726295 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.726325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.726346 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.829117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.829162 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.829174 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.829190 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.829201 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.931570 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.931608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.931616 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.931629 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:17 crc kubenswrapper[4732]: I1010 06:52:17.931638 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:17Z","lastTransitionTime":"2025-10-10T06:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.034508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.034586 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.034610 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.034641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.034665 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.137772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.137834 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.137848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.137864 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.137877 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.240023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.240327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.240438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.240530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.240616 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.343469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.343758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.343891 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.344002 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.344088 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.446523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.446564 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.446575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.446592 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.446604 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.548754 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.548785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.548793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.548809 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.548817 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.651235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.651280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.651289 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.651303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.651314 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.659829 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.659872 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.659900 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:18 crc kubenswrapper[4732]: E1010 06:52:18.659965 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:18 crc kubenswrapper[4732]: E1010 06:52:18.660116 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:18 crc kubenswrapper[4732]: E1010 06:52:18.660409 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.672416 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.753501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.753554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.753568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.753586 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.753597 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.856511 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.856558 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.856571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.856589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.856602 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.958556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.958605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.958614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.958632 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:18 crc kubenswrapper[4732]: I1010 06:52:18.958641 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:18Z","lastTransitionTime":"2025-10-10T06:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.065826 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.065883 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.065899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.065923 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.065941 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.169110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.169197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.169211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.169233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.169247 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.272800 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.272859 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.272871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.272902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.272921 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.375580 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.375642 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.375651 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.375734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.375750 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.478519 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.478563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.478576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.478593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.478605 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.580945 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.580986 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.580998 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.581014 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.581025 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.660205 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:19 crc kubenswrapper[4732]: E1010 06:52:19.660493 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.683845 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.683887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.683900 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.683916 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.683939 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.786281 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.786327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.786338 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.786354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.786365 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.889033 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.889062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.889072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.889088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.889098 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.991466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.991505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.991513 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.991527 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:19 crc kubenswrapper[4732]: I1010 06:52:19.991537 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:19Z","lastTransitionTime":"2025-10-10T06:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.044135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.044185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.044196 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.044211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.044222 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.057914 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:20Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.061667 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.061715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.061723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.061737 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.061746 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.071645 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:20Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.075142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.075281 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.075368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.075496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.075582 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.086002 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:20Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.089675 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.089735 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.089745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.089762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.089772 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.102457 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:20Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.106590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.106627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.106636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.106651 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.106660 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.119046 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:20Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.119225 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.120556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.120586 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.120599 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.120617 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.120629 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.223053 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.223104 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.223120 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.223142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.223158 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.325595 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.325658 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.325680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.325754 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.325776 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.427804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.427850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.427862 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.427878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.427889 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.530395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.530440 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.530453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.530472 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.530484 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.632719 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.632750 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.632758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.632772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.632780 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.659270 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.659326 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.659270 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.659424 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.659519 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:20 crc kubenswrapper[4732]: E1010 06:52:20.659567 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.734950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.735479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.735548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.735608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.735662 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.838836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.838875 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.838887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.838905 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.838920 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.941743 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.941777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.941787 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.941805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:20 crc kubenswrapper[4732]: I1010 06:52:20.941816 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:20Z","lastTransitionTime":"2025-10-10T06:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.044863 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.044913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.044925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.044947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.044963 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.148745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.149132 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.149218 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.149317 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.149413 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.252788 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.252832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.252843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.252861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.252873 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.356129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.356174 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.356185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.356202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.356214 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.459177 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.459234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.459248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.459271 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.459287 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.562350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.562400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.562412 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.562431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.562443 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.659924 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:21 crc kubenswrapper[4732]: E1010 06:52:21.660071 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.661255 4732 scope.go:117] "RemoveContainer" containerID="a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb" Oct 10 06:52:21 crc kubenswrapper[4732]: E1010 06:52:21.661496 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.664734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.664816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.664832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.664853 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.664867 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.767613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.767668 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.767681 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.767722 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.767738 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.871102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.871147 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.871172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.871190 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.871203 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.973977 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.974020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.974032 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.974048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:21 crc kubenswrapper[4732]: I1010 06:52:21.974060 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:21Z","lastTransitionTime":"2025-10-10T06:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.075792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.075841 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.075853 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.075871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.075882 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.178710 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.178755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.178765 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.178778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.178788 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.280759 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.280805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.280817 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.280834 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.280846 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.383851 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.383898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.383912 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.383930 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.383942 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.486825 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.486920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.486941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.486968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.487028 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.591225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.591594 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.591772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.591938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.592107 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.659865 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.659869 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.659869 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:22 crc kubenswrapper[4732]: E1010 06:52:22.660464 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:22 crc kubenswrapper[4732]: E1010 06:52:22.660268 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:22 crc kubenswrapper[4732]: E1010 06:52:22.660615 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.694815 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.695112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.695222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.695327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.695415 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.798375 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.798438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.798449 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.798463 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.798475 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.900314 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.900355 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.900367 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.900382 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:22 crc kubenswrapper[4732]: I1010 06:52:22.900392 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:22Z","lastTransitionTime":"2025-10-10T06:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.003307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.003350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.003362 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.003379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.003390 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.106381 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.106418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.106428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.106442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.106452 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.209598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.209666 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.209729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.209763 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.209785 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.312953 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.313043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.313060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.313084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.313101 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.416041 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.416104 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.416127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.416161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.416181 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.518356 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.518414 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.518530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.518560 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.518588 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.622678 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.622730 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.622743 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.622760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.622772 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.660632 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:23 crc kubenswrapper[4732]: E1010 06:52:23.660784 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.682588 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\
\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clus
ter-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.703432 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.716932 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.725251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.725308 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.725324 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc 
kubenswrapper[4732]: I1010 06:52:23.725345 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.725360 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.728676 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10
T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.748528 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.765464 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.780963 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.794409 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.804741 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.814834 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.825544 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.827942 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.827973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.827984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.827999 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.828011 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.837664 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.851975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.863851 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc 
kubenswrapper[4732]: I1010 06:52:23.883141 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.893802 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.905385 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.918893 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.929753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.929948 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.930145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.930340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.930515 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:23Z","lastTransitionTime":"2025-10-10T06:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:23 crc kubenswrapper[4732]: I1010 06:52:23.937758 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:23Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.033285 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.033323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.033334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.033349 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.033359 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.135640 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.135899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.135965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.136036 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.136099 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.238409 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.238479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.238489 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.238506 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.238519 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.341127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.341183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.341195 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.341215 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.341228 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.367767 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:24 crc kubenswrapper[4732]: E1010 06:52:24.367921 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:24 crc kubenswrapper[4732]: E1010 06:52:24.367978 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:52:56.367961867 +0000 UTC m=+103.437553108 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.443943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.444156 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.444265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.444345 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.444433 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.546984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.547049 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.547063 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.547100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.547109 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.650978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.651019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.651028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.651045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.651054 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.659465 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.659510 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:24 crc kubenswrapper[4732]: E1010 06:52:24.659596 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.659465 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:24 crc kubenswrapper[4732]: E1010 06:52:24.659795 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:24 crc kubenswrapper[4732]: E1010 06:52:24.659937 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.754464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.754513 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.754530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.754552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.754569 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.857581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.857627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.857636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.857650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.857659 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.959330 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.959369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.959379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.959399 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:24 crc kubenswrapper[4732]: I1010 06:52:24.959415 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:24Z","lastTransitionTime":"2025-10-10T06:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.063172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.063241 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.063265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.063293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.063313 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.165659 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.165716 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.165724 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.165737 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.165748 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.267928 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.267982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.267991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.268006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.268015 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.370069 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.370110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.370119 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.370132 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.370141 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.472462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.472517 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.472535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.472556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.472573 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.574643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.574736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.574769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.574793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.574809 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.660229 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:25 crc kubenswrapper[4732]: E1010 06:52:25.660419 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.676327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.676476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.676586 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.676679 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.676800 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.779083 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.779251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.779359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.779487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.779582 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.882456 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.882514 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.882536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.882567 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.882604 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.984854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.984905 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.984921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.984944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:25 crc kubenswrapper[4732]: I1010 06:52:25.984961 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:25Z","lastTransitionTime":"2025-10-10T06:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.060588 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/0.log" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.060645 4732 generic.go:334] "Generic (PLEG): container finished" podID="d94cc3c3-3cb6-4a5b-996b-90099415f9bf" containerID="8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef" exitCode=1 Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.060681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerDied","Data":"8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.061128 4732 scope.go:117] "RemoveContainer" containerID="8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.076175 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.088379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.088420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.088431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.088447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.088459 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.089795 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.101802 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.124759 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.154402 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.173433 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.186149 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.190642 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.190678 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.190706 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc 
kubenswrapper[4732]: I1010 06:52:26.190724 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.190735 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.199350 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.210706 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.221600 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.237658 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.250775 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.265759 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.276190 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc 
kubenswrapper[4732]: I1010 06:52:26.292706 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.292752 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.292766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.292785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.292797 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.297080 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.313417 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.326507 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.340908 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.359053 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:26Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.394425 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.394467 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.394478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.394492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.394504 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.496681 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.496746 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.496758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.496776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.496787 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.602632 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.602965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.603031 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.603113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.603169 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.659150 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:26 crc kubenswrapper[4732]: E1010 06:52:26.659308 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.659486 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.659620 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:26 crc kubenswrapper[4732]: E1010 06:52:26.659860 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:26 crc kubenswrapper[4732]: E1010 06:52:26.660306 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.706122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.706158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.706168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.706182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.706191 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.808854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.808901 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.808915 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.808932 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.808945 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.911827 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.911895 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.911918 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.911947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:26 crc kubenswrapper[4732]: I1010 06:52:26.911967 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:26Z","lastTransitionTime":"2025-10-10T06:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.014537 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.014579 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.014591 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.014608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.014619 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.066299 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/0.log" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.066559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerStarted","Data":"c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.082662 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.098361 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.113947 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.118165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.118225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.118244 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.118268 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.118296 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.142168 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.164749 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.180087 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.200533 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.213285 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.219934 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.219963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.219974 4732 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.219993 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.220005 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.226984 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.238759 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.250898 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.262154 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.275271 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.285716 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.296289 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.308972 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.321416 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.322938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.323050 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.323128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.323208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.323266 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.331081 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc 
kubenswrapper[4732]: I1010 06:52:27.342710 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:27Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.425655 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.425715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.425728 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.425745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 
06:52:27.425757 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.527849 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.527913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.527931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.527953 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.527970 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.630655 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.630765 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.630793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.630818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.630837 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.659334 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:27 crc kubenswrapper[4732]: E1010 06:52:27.659519 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.733861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.733893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.733919 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.733935 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.733948 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.836543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.836591 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.836606 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.836626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.836641 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.938663 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.938727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.938740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.938757 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:27 crc kubenswrapper[4732]: I1010 06:52:27.938769 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:27Z","lastTransitionTime":"2025-10-10T06:52:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.040880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.040911 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.040919 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.040932 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.040940 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.143410 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.143444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.143454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.143468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.143480 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.246571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.246638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.246656 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.246682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.246741 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.349290 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.349362 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.349379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.349407 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.349428 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.453009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.453101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.453128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.453161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.453185 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.557077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.557124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.557141 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.557165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.557182 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.659204 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.659272 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:28 crc kubenswrapper[4732]: E1010 06:52:28.659388 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.659209 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:28 crc kubenswrapper[4732]: E1010 06:52:28.659620 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:28 crc kubenswrapper[4732]: E1010 06:52:28.659791 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.661442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.661496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.661512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.661533 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.661550 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.764969 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.765028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.765046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.765073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.765093 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.868980 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.869300 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.869429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.869566 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.869743 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.972656 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.972798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.972813 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.972831 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:28 crc kubenswrapper[4732]: I1010 06:52:28.972842 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:28Z","lastTransitionTime":"2025-10-10T06:52:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.075767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.075824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.075847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.075875 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.075898 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.178441 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.178477 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.178486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.178501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.178512 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.281524 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.281646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.281669 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.281732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.281755 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.385250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.385611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.385795 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.385991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.386134 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.489535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.489590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.489606 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.489629 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.489647 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.592766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.592811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.592828 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.592851 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.592869 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.659431 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:29 crc kubenswrapper[4732]: E1010 06:52:29.659619 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.695614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.695651 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.695738 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.695758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.695770 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.798937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.798991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.799047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.799073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.799091 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.901989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.902303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.902490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.902677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:29 crc kubenswrapper[4732]: I1010 06:52:29.902911 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:29Z","lastTransitionTime":"2025-10-10T06:52:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.006478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.006537 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.006547 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.006563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.006575 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.109660 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.109781 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.109810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.109837 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.109857 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.213264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.213315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.213331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.213355 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.213371 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.316341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.316415 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.316433 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.316456 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.316473 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.420121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.420189 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.420211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.420239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.420264 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.452013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.452062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.452078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.452098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.452114 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.467235 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:30Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.472675 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.472870 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.472939 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.472963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.472976 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.484878 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:30Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.488318 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.488389 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.488435 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.488457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.488468 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.505525 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:30Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.508655 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.508680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.508706 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.508723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.508733 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.519944 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:30Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.523717 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.523759 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.523770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.523786 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.523798 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.537289 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:30Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.537515 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.539225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.539255 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.539263 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.539275 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.539301 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.641523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.641559 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.641567 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.641581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.641589 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.659153 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.659267 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.659294 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.659351 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.659456 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:30 crc kubenswrapper[4732]: E1010 06:52:30.659499 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.743882 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.743960 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.743986 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.744020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.744042 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.846015 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.846057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.846072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.846090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.846105 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.949509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.949542 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.949551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.949582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:30 crc kubenswrapper[4732]: I1010 06:52:30.949591 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:30Z","lastTransitionTime":"2025-10-10T06:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.052444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.052505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.052522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.052547 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.052565 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.155612 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.155667 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.155683 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.155727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.155751 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.258988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.259059 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.259084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.259114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.259136 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.362010 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.362061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.362077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.362099 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.362116 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.465808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.465966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.465992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.466070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.466103 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.569048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.569111 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.569129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.569155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.569174 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.660152 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:31 crc kubenswrapper[4732]: E1010 06:52:31.660320 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.672928 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.673101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.673155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.673177 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.673194 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.776613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.776726 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.776744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.776767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.776783 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.880408 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.880512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.880530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.880555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.880573 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.984116 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.984181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.984198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.984225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:31 crc kubenswrapper[4732]: I1010 06:52:31.984242 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:31Z","lastTransitionTime":"2025-10-10T06:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.087582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.087645 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.087665 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.087716 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.087735 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.191222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.191309 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.191363 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.191389 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.191408 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.294551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.294617 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.294628 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.294643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.294657 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.397579 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.397636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.397654 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.397684 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.397729 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.500816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.500886 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.500903 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.500923 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.500934 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.604026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.604079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.604092 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.604108 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.604121 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.660136 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.660136 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:32 crc kubenswrapper[4732]: E1010 06:52:32.660276 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:32 crc kubenswrapper[4732]: E1010 06:52:32.660355 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.660156 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:32 crc kubenswrapper[4732]: E1010 06:52:32.660490 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.709305 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.709418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.709468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.709505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.709530 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.811577 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.811604 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.811611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.811624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.811632 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.914184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.914224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.914235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.914251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:32 crc kubenswrapper[4732]: I1010 06:52:32.914262 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:32Z","lastTransitionTime":"2025-10-10T06:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.016676 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.016744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.016758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.016776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.016786 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.119380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.119434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.119445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.119464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.119477 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.221763 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.221818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.221833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.221854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.221870 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.324741 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.324780 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.324789 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.324803 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.324812 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.427769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.428134 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.428270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.428384 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.428474 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.530630 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.530997 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.531126 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.531266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.531356 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.634153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.634502 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.634750 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.635006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.635224 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.659645 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:33 crc kubenswrapper[4732]: E1010 06:52:33.659770 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.661415 4732 scope.go:117] "RemoveContainer" containerID="a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.676279 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.692158 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.707167 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.726007 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.738127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.738410 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.738534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.738852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.739043 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.741936 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.752622 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.770386 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10
T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.790505 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.809979 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 
[cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"nam
e\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.823390 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc 
kubenswrapper[4732]: I1010 06:52:33.843812 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.845584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.845753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.847107 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.847373 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.847815 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.856481 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831d
cee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.871937 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.888888 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.914961 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.928724 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.945576 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.950965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.951013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.951022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 
06:52:33.951036 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.951046 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:33Z","lastTransitionTime":"2025-10-10T06:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.962240 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:33 crc kubenswrapper[4732]: I1010 06:52:33.973664 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-10T06:52:33Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.053507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.053552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.053561 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.053573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.053582 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.095688 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/2.log" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.098830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.100157 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.122904 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2d
c300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.138441 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.150270 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.156061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.156100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.156108 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.156139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.156148 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.163101 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.182318 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node 
crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.195190 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.210345 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.220919 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.231065 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.243162 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.253127 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc 
kubenswrapper[4732]: I1010 06:52:34.257979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.258045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.258057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.258072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.258082 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.268192 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2
dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.282093 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.294196 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.302471 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.314611 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.327985 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.340444 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.350132 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:34Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:34 crc 
kubenswrapper[4732]: I1010 06:52:34.360595 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.360618 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.360625 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.360638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.360647 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.463060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.463114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.463129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.463151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.463169 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.565622 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.565715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.565731 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.565748 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.565788 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.660132 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.660158 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.660158 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:34 crc kubenswrapper[4732]: E1010 06:52:34.660297 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:34 crc kubenswrapper[4732]: E1010 06:52:34.660486 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:34 crc kubenswrapper[4732]: E1010 06:52:34.660593 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.667889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.667958 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.667982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.668040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.668063 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.771186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.771253 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.771277 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.771306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.771329 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.875182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.875226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.875241 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.875260 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.875273 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.978333 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.978404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.978429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.978458 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:34 crc kubenswrapper[4732]: I1010 06:52:34.978478 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:34Z","lastTransitionTime":"2025-10-10T06:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.082409 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.082462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.082478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.082500 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.082517 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.104471 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/3.log" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.105449 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/2.log" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.110344 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" exitCode=1 Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.110416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.110458 4732 scope.go:117] "RemoveContainer" containerID="a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.113533 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 06:52:35 crc kubenswrapper[4732]: E1010 06:52:35.114020 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.133019 4732 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cddd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.153123 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.166285 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.179158 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.184322 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.184361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.184376 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc 
kubenswrapper[4732]: I1010 06:52:35.184395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.184410 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.196675 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.209318 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.220501 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.232091 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.252943 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.268452 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.284372 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc 
kubenswrapper[4732]: I1010 06:52:35.286947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.286987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.287000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.287019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.287033 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.299939 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.314072 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.329959 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.347769 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3
ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.375733 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8aab97cffa232f9bc9c214efe2ae805b6bc6c13d49db0b531b78e31f33163bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:04Z\\\",\\\"message\\\":\\\"il\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} 
name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1010 06:52:04.576795 6376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1010 06:52:04.576804 6376 ovnkube.go:599] Stopped ovnkube\\\\nI1010 06:52:04.576869 6376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1010 06:52:04.577016 6376 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:34Z\\\",\\\"message\\\":\\\"lude.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 
ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007b72dcf \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1010 06:52:34.478441 6773 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w\\\\nI1010 06:52:34.478446 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc 
kubenswrapper[4732]: I1010 06:52:35.389101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.389401 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.389535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.389664 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.389795 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.393821 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.406481 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.426232 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:35Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.493003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.493051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.493063 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.493083 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.493096 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.595999 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.596030 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.596041 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.596056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.596066 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.659499 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:35 crc kubenswrapper[4732]: E1010 06:52:35.659752 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.698814 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.698864 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.698880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.698901 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.698918 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.800893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.800936 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.800948 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.800965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.800976 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.903531 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.903575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.903587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.903605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:35 crc kubenswrapper[4732]: I1010 06:52:35.903618 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:35Z","lastTransitionTime":"2025-10-10T06:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.006083 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.006131 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.006146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.006165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.006181 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.109425 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.109476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.109489 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.109507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.109519 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.114613 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/3.log" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.124815 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.125027 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.139609 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.158877 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.177390 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.189435 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.203226 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.211685 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.211740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.211751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.211769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.211783 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.215876 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.226478 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.237355 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a
42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc 
kubenswrapper[4732]: I1010 06:52:36.250082 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.266604 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.279712 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc 
kubenswrapper[4732]: I1010 06:52:36.296593 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.310944 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.314666 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.314750 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.314772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.314796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.314811 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.358421 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.373214 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3
ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.392032 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:34Z\\\",\\\"message\\\":\\\"lude.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007b72dcf \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1010 06:52:34.478441 6773 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w\\\\nI1010 06:52:34.478446 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.418494 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.418587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.418602 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.418619 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.418652 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.421070 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.434279 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.453305 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:36Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.495306 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.495383 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.495421 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.495516 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.495617 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:53:40.495577978 +0000 UTC m=+147.565169249 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.495704 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:53:40.495670361 +0000 UTC m=+147.565261722 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.495756 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.495949 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-10 06:53:40.495889177 +0000 UTC m=+147.565480448 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.521913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.521941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.521952 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.521967 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.521976 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.596112 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.596168 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596292 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596313 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596366 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596323 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596390 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596391 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596445 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-10 06:53:40.596428907 +0000 UTC m=+147.666020148 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.596459 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-10 06:53:40.596453947 +0000 UTC m=+147.666045188 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.624741 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.624810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.624834 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.624864 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.624884 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.659233 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.659261 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.659299 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk"
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.659389 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.659490 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 10 06:52:36 crc kubenswrapper[4732]: E1010 06:52:36.659658 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.727832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.727872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.727917 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.727942 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.727960 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.831325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.831394 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.831417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.831444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.831465 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.933453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.933488 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.933496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.933510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:36 crc kubenswrapper[4732]: I1010 06:52:36.933520 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:36Z","lastTransitionTime":"2025-10-10T06:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.037012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.037072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.037088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.037112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.037133 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.139323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.139778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.140510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.140553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.140570 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.243657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.243968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.244031 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.244092 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.244156 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.346712 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.346770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.346785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.346808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.346825 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.450239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.450300 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.450309 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.450326 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.450338 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.553281 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.553341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.553358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.553382 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.553399 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.656382 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.656652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.656936 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.657165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.657379 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.659923 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 10 06:52:37 crc kubenswrapper[4732]: E1010 06:52:37.660126 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.760168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.760214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.760225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.760243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.760255 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.863019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.863376 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.863578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.863760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.863917 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.968091 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.968481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.968662 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.968914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:37 crc kubenswrapper[4732]: I1010 06:52:37.969115 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:37Z","lastTransitionTime":"2025-10-10T06:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.073088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.073510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.073665 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.073878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.074034 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.177149 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.177191 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.177204 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.177220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.177231 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.279439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.279479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.279492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.279509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.279521 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.382836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.382907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.382929 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.382951 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.382971 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.486066 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.486128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.486145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.486168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.486195 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.589425 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.589575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.589601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.589631 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.589650 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.660220 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.660275 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.660305 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk"
Oct 10 06:52:38 crc kubenswrapper[4732]: E1010 06:52:38.660441 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 10 06:52:38 crc kubenswrapper[4732]: E1010 06:52:38.660513 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 10 06:52:38 crc kubenswrapper[4732]: E1010 06:52:38.660631 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.692778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.693074 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.693203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.693383 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.693519 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.797084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.797433 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.797569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.797733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.797892 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.902367 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.902455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.902481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.902515 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:38 crc kubenswrapper[4732]: I1010 06:52:38.902538 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:38Z","lastTransitionTime":"2025-10-10T06:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.005899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.005954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.005966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.005986 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.005998 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.108504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.108548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.108562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.108581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.108594 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.211072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.211110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.211120 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.211134 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.211146 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.314082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.314130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.314146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.314162 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.314177 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.416706 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.416766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.416776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.416788 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.416813 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.519452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.519501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.519519 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.519544 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.519564 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.623196 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.623246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.623263 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.623288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.623304 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.660251 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:39 crc kubenswrapper[4732]: E1010 06:52:39.660459 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.726356 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.726431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.726457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.726489 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.726516 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.830077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.830139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.830153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.830175 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.830190 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.932842 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.932886 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.932901 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.932922 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:39 crc kubenswrapper[4732]: I1010 06:52:39.932938 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:39Z","lastTransitionTime":"2025-10-10T06:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.035168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.035215 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.035229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.035248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.035262 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.137146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.137193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.137206 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.137224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.137238 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.240207 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.240836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.240931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.241020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.241139 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.343765 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.343811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.343827 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.343846 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.343858 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.446498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.446543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.446556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.446574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.446586 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.549189 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.549237 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.549249 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.549266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.549281 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.644656 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.644740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.644755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.644770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.644783 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.657818 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.660168 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.660347 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.660651 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.660773 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.660384 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.660843 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.662552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.662579 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.662588 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.662601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.662611 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.674177 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.677816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.677974 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.678086 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.678232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.678340 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.691966 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.695657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.695736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.695759 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.695781 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.695797 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.707740 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.711252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.711282 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.711292 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.711307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.711318 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.723936 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:40Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:40 crc kubenswrapper[4732]: E1010 06:52:40.724048 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.725222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.725248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.725258 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.725270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.725279 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.827512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.827554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.827563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.827578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.827586 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.930518 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.930789 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.930859 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.930920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:40 crc kubenswrapper[4732]: I1010 06:52:40.930987 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:40Z","lastTransitionTime":"2025-10-10T06:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.033283 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.033318 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.033332 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.033347 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.033360 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.136063 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.136099 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.136109 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.136123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.136133 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.238400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.238478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.238507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.238538 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.238563 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.341293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.341335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.341346 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.341363 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.341375 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.445992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.446039 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.446051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.446070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.446085 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.549548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.549595 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.549611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.549630 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.549644 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.653272 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.653303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.653312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.653325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.653334 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.660561 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:41 crc kubenswrapper[4732]: E1010 06:52:41.660680 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.756425 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.756484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.756503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.756526 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.756544 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.858836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.858874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.858887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.858904 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.858916 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.962151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.962185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.962194 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.962206 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:41 crc kubenswrapper[4732]: I1010 06:52:41.962215 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:41Z","lastTransitionTime":"2025-10-10T06:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.065812 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.065866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.065882 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.065902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.065918 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.169616 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.169677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.169714 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.169732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.169745 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.273374 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.273446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.273470 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.273501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.273526 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.376766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.376814 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.376824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.376840 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.376851 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.478947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.478979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.478987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.479000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.479009 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.581559 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.581608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.581624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.581644 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.581660 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.659871 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:42 crc kubenswrapper[4732]: E1010 06:52:42.659987 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.660157 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:42 crc kubenswrapper[4732]: E1010 06:52:42.660199 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.660294 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:42 crc kubenswrapper[4732]: E1010 06:52:42.660341 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.684021 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.684059 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.684071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.684090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.684103 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.786531 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.786565 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.786573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.786585 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.786595 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.888780 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.888835 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.888846 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.888863 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.888875 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.992123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.992171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.992183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.992199 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:42 crc kubenswrapper[4732]: I1010 06:52:42.992210 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:42Z","lastTransitionTime":"2025-10-10T06:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.095369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.095479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.095504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.095534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.095579 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.199186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.199256 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.199279 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.199307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.199329 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.302775 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.302851 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.302876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.302906 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.302928 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.407535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.407612 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.407636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.407664 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.407734 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.510333 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.510406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.510442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.510468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.510488 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.613366 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.613450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.613479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.613510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.613533 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.659806 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:43 crc kubenswrapper[4732]: E1010 06:52:43.659977 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.679379 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.700218 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.716979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.717028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.717048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.717071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.717088 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.720802 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.741534 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.759015 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.777189 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.801419 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.816626 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.819576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.819643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.819657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.819676 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.819691 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.833685 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.847427 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.866751 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe
3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.881200 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.895566 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.911277 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.923723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.923758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.923770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.923795 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.923810 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:43Z","lastTransitionTime":"2025-10-10T06:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.935399 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:34Z\\\",\\\"message\\\":\\\"lude.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007b72dcf \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1010 06:52:34.478441 6773 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w\\\\nI1010 06:52:34.478446 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.949406 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.960693 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.972067 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:43 crc kubenswrapper[4732]: I1010 06:52:43.983891 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:43Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.025862 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.025923 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.025939 4732 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.025962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.025978 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.127997 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.128035 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.128044 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.128058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.128067 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.230953 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.231009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.231024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.231073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.231091 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.333445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.333481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.333490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.333504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.333514 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.436483 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.436548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.436584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.436619 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.436639 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.541216 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.541250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.541259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.541275 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.541287 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.644167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.644229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.644240 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.644269 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.644285 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.659137 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.659220 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:44 crc kubenswrapper[4732]: E1010 06:52:44.659243 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.659346 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:44 crc kubenswrapper[4732]: E1010 06:52:44.659406 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:44 crc kubenswrapper[4732]: E1010 06:52:44.659581 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.748075 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.748123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.748132 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.748148 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.748158 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.851509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.851571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.851589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.851613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.851633 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.954843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.954871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.954880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.954893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:44 crc kubenswrapper[4732]: I1010 06:52:44.954901 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:44Z","lastTransitionTime":"2025-10-10T06:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.058045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.058098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.058113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.058129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.058140 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.161019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.161076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.161093 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.161116 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.161135 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.264052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.264098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.264114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.264136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.264152 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.367450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.367518 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.367536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.367560 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.367577 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.470646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.470815 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.470849 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.470872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.470888 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.573810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.573870 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.573890 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.573911 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.573925 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.660282 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:45 crc kubenswrapper[4732]: E1010 06:52:45.660471 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.677459 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.677524 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.677537 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.677554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.677569 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.781332 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.781418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.781452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.781474 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.781486 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.884287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.884351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.884367 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.884390 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.884409 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.988602 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.988717 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.988733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.988760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:45 crc kubenswrapper[4732]: I1010 06:52:45.988784 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:45Z","lastTransitionTime":"2025-10-10T06:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.092085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.092170 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.092182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.092222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.092238 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.194748 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.194782 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.194790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.194804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.194813 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.298234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.298311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.298335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.298369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.298411 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.401271 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.401344 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.401364 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.401441 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.401468 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.503675 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.503854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.503888 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.503959 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.503984 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.607453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.607549 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.607644 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.607833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.607861 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.660247 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.660284 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:46 crc kubenswrapper[4732]: E1010 06:52:46.660438 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.660449 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:46 crc kubenswrapper[4732]: E1010 06:52:46.660561 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:46 crc kubenswrapper[4732]: E1010 06:52:46.660654 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.711365 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.711434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.711455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.711476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.711494 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.815012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.815111 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.815135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.815207 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.815234 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.918233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.918304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.918321 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.918637 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:46 crc kubenswrapper[4732]: I1010 06:52:46.918667 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:46Z","lastTransitionTime":"2025-10-10T06:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.021579 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.021644 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.021659 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.021672 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.021732 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.124492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.124543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.124557 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.124576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.124590 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.228265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.228343 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.228368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.228400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.228424 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.331821 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.331935 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.331961 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.331991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.332014 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.434285 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.434328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.434342 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.434360 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.434373 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.537889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.537955 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.537979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.538008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.538030 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.641298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.641366 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.641383 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.641405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.641422 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.660400 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:47 crc kubenswrapper[4732]: E1010 06:52:47.660903 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.743907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.743960 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.743978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.744000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.744018 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.846234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.846312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.846323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.846340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.846352 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.948652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.948753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.948774 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.948797 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:47 crc kubenswrapper[4732]: I1010 06:52:47.948819 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:47Z","lastTransitionTime":"2025-10-10T06:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.051598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.051652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.051670 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.051722 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.051739 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.155262 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.155325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.155335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.155353 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.155364 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.257893 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.258004 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.258028 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.258057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.258080 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.360971 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.361064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.361083 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.361110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.361129 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.463311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.463363 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.463372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.463386 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.463395 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.565566 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.565624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.565637 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.565656 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.565669 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.660661 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:48 crc kubenswrapper[4732]: E1010 06:52:48.665903 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.665726 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:48 crc kubenswrapper[4732]: E1010 06:52:48.666254 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.667113 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:48 crc kubenswrapper[4732]: E1010 06:52:48.667235 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.667764 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 06:52:48 crc kubenswrapper[4732]: E1010 06:52:48.667928 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.669043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.669082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.669094 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.669108 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.669120 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.772315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.772412 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.772431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.772458 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.772476 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.875141 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.875175 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.875185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.875200 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.875210 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.977464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.977524 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.977535 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.977554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:48 crc kubenswrapper[4732]: I1010 06:52:48.977565 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:48Z","lastTransitionTime":"2025-10-10T06:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.080380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.080441 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.080452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.080471 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.080482 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.182985 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.183043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.183060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.183082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.183101 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.285519 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.285609 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.285626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.285674 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.285741 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.389549 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.389621 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.389648 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.389677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.389740 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.493134 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.493192 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.493213 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.493238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.493256 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.596610 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.596680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.596739 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.596777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.596799 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.661057 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:49 crc kubenswrapper[4732]: E1010 06:52:49.661218 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.699423 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.699473 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.699488 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.699509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.699523 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.803208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.803570 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.803821 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.804021 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.804209 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.907137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.907170 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.907181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.907198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:49 crc kubenswrapper[4732]: I1010 06:52:49.907212 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:49Z","lastTransitionTime":"2025-10-10T06:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.010407 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.010966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.011261 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.011460 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.011642 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.115373 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.115451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.115491 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.115523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.115546 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.218205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.218250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.218259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.218273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.218281 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.320532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.320587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.320598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.320613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.320623 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.422547 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.422589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.422599 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.422614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.422624 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.524877 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.524999 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.525017 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.525037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.525054 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.628392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.628469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.628497 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.628531 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.628553 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.659558 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.659614 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.659558 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:50 crc kubenswrapper[4732]: E1010 06:52:50.659673 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:50 crc kubenswrapper[4732]: E1010 06:52:50.659965 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:50 crc kubenswrapper[4732]: E1010 06:52:50.660013 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.732183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.732249 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.732270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.732299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.732323 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.834953 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.835001 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.835018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.835042 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.835060 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.937683 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.937742 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.937758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.937779 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:50 crc kubenswrapper[4732]: I1010 06:52:50.937794 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:50Z","lastTransitionTime":"2025-10-10T06:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.007081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.007114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.007122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.007135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.007146 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: E1010 06:52:51.017661 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.021202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.021246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.021259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.021292 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.021307 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: E1010 06:52:51.032100 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.035886 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.035924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.035934 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.035949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.036214 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.055740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.055793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.055807 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.055821 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.055833 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: E1010 06:52:51.066754 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.069993 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.070068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.070101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.070130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.070151 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: E1010 06:52:51.081947 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:51Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:51 crc kubenswrapper[4732]: E1010 06:52:51.082260 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.084053 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.084076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.084084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.084100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.084117 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.186230 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.186297 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.186318 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.186346 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.186379 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.288902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.288967 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.288989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.289016 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.289036 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.392208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.392252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.392264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.392282 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.392295 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.494179 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.494231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.494243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.494260 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.494272 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.596910 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.596947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.596959 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.596974 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.596986 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.660289 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:51 crc kubenswrapper[4732]: E1010 06:52:51.660467 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.699465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.699507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.699518 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.699533 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.699544 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.802835 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.802874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.802884 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.802925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.802938 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.905078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.905145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.905160 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.905201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:51 crc kubenswrapper[4732]: I1010 06:52:51.905241 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:51Z","lastTransitionTime":"2025-10-10T06:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.008109 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.008224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.008243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.008267 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.008285 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.111243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.111296 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.111313 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.111334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.111349 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.213034 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.213069 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.213079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.213093 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.213101 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.315460 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.315514 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.315527 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.315546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.315565 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.418136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.418422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.418528 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.418623 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.418734 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.521783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.521837 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.521854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.521876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.521896 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.623757 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.623861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.623873 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.623888 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.623898 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.659952 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:52 crc kubenswrapper[4732]: E1010 06:52:52.660136 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.660003 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:52 crc kubenswrapper[4732]: E1010 06:52:52.660262 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.660003 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:52 crc kubenswrapper[4732]: E1010 06:52:52.660544 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.726088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.726142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.726156 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.726176 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.726190 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.829452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.829968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.830037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.830110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.830214 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.933034 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.933413 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.933552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.933689 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:52 crc kubenswrapper[4732]: I1010 06:52:52.933870 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:52Z","lastTransitionTime":"2025-10-10T06:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.037033 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.037070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.037079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.037093 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.037102 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.144422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.144462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.144473 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.144512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.144524 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.247545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.247610 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.247627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.247649 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.247664 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.350038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.350158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.350185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.350214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.350235 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.452604 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.452647 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.452657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.452700 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.452713 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.555762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.555810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.555821 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.555839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.555852 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.659082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.659119 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.659128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.659142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.659150 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.659343 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:53 crc kubenswrapper[4732]: E1010 06:52:53.659488 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.685029 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnlkp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d94cc3c3-3cb6-4a5b-996b-90099415f9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:25Z\\\",\\\"message\\\":\\\"2025-10-10T06:51:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f\\\\n2025-10-10T06:51:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_31f47ecd-e1b5-4215-bbad-cb89d2a2169f to /host/opt/cni/bin/\\\\n2025-10-10T06:51:40Z [verbose] multus-daemon started\\\\n2025-10-10T06:51:40Z [verbose] Readiness Indicator file check\\\\n2025-10-10T06:52:25Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:52:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4fnwz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnlkp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.698434 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77abff23-1622-4219-a841-49fe8dbb6cc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-spk7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mj7bk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc 
kubenswrapper[4732]: I1010 06:52:53.716067 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f96e67c70612002c0a71bd469b30e93a5bd5f7e47f383c90c948c0592240aa7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5268547ed9b36109187d233d54c8f205187c29736582419d6fe2180f8ba1e55f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.729641 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3824c106845135f2596e69979c3538a1654923fe2206221eae0bfc5f5b51e469\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.744560 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea645a2f1d55d40000a7caf8919c6a3b54c656002322af047f78021d1ba3f5db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.759477 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aef461dc-7905-4f5e-a90e-046ffcf8258d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2ad9781e115363e09d7f8a313b7bcd0c80c859be9f44bc64d572cae49b30503\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3d5e4399747304ff650c032e302ec2775b026b20b5ca5550436bd8c952c918fc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://738bed52b54e0d8650d401cba0a36f2fa3ceaae5e9ea22ffdfeec50a9353bf99\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91b5153dfdf7f7a5c5d91460f03f5a9a547ccea48ebbeacea62858b1b160d77a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T
06:51:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6cd3ed06ce02a70948834548ce557e2af2dd2bcb974476e65fb111b11f654d2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6204c9d1fe372db0cf0b29cf8d9a041479f5024f6b292f59526c53861c858b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a649516c06036d1268769c887e8c96f1c12f0bb1e18a90481fb9045df736da6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qlqtl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2fxmr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.760746 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.760781 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.760792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.760810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.760822 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.776752 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-10T06:52:34Z\\\",\\\"message\\\":\\\"lude.release.openshift.io/self-managed-high-availability:true include.release.openshift.io/single-node-developer:true service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc007b72dcf \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: service-ca-operator,},ClusterIP:10.217.4.40,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.40],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1010 06:52:34.478441 6773 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w\\\\nI1010 06:52:34.478446 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:52:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6aae39802504d179d5
b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgdch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kdb2x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.806404 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8c44500-0108-47e2-9bca-fb313d2bc047\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55147e4e7303831e82636a39e93739e81f8f0e917c35cf54916877ccabb35b47\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4bcc752999ab2ab4f9d6ae33eb3f6aef94361ee7446c11d67ed875143794f7de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0126cc38bfdabf1d0049d6491e3c62ca968e3369a8cbff16450e80083ef31aa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://da86d8b1efd1aa20c7d159bfe3a78c85b691aecf3475894c57537c76014a0e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e496813da7ac76a023244a32fb5bbb7531f62a8b0953c7ed6d07aad1e1c141ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c0d317c8f0ec223d8c509b5adfb8bd53081238bf7c95ca0df0dcf9244ed8e1\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://36ad69bd9facb2228ad0318eb4c1d4468787fdde2b1f185d0d80104013192ef9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc300905cbba5d3a201ba02f8e0128a9e8509ec6d5fe8f4fbfe668640674b2e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.820223 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b12e3e9c-b481-492c-8963-ae02431fcc75\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:52:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b0395b08d115dcd43d6aa3f32fab86b866854fede37a941cb64413866d9cb1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b90a41b40a2afa48bb881e96b9831dcee5ef761a9d74e8818c754af9aa1edbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2431d5edc5a61b974e1fcae7cb5513561ecbcedc4600e93458b6da78c423dc0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7bad256ee810fdf6a67fe1d3ebdaf424c96b8e4a362522eaafd871cfec59bbee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.834444 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ca39c55-1a82-41b2-b7d5-925320a4e8a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63be00a80f60cc70b36e97d12d836f6204d53e51e63061951f1cc86cc57a1649\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280
fdc75f736aafe69265da9d59\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46xph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-292kd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.846193 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f8094ef-2a6d-4a6c-add7-628eff37abe3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f2101d5f247fa819b83a17de95e47b5f7716f26caaa697970b6e8283e0d03cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6388503df8fbd52d9af256d51bbb6a44e32cd
dd6e2df8fda36a9bac0beaf0dbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xxrsh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:50Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tvj9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.862121 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"913ead09-c1b2-4e36-9a0e-23ecca3566af\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ffa4d1bf7107234fe7fd3271c35af19c5d60d08a917f84ac7cc922294a57c70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d43bae2d582cb03b4d38c8d3db599de72c83008c4c28e1a219ebd37ec642351\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7d19cf071f115a608584eaecf8102f8115229e2dd999d5a4d91382465060f9f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c17d5ee0a9a8e62f1c39333f8b0168fdf51c1bda359cbb42defa729534594b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b80234de0ea30a91843e2bcc4afc7f4fd873f6778bee0a29c3d8df0350e121eb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-10T06:51:32Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1010 06:51:27.075052 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1010 06:51:27.077749 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-467612861/tls.crt::/tmp/serving-cert-467612861/tls.key\\\\\\\"\\\\nI1010 06:51:32.498992 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1010 06:51:32.506102 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1010 06:51:32.506153 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1010 06:51:32.506210 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1010 06:51:32.506225 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1010 06:51:32.515449 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1010 06:51:32.515493 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1010 06:51:32.515510 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515522 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1010 06:51:32.515530 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1010 06:51:32.515536 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1010 06:51:32.515543 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1010 06:51:32.515548 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1010 06:51:32.519310 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://354778d5de62eca5e7a3bbc36b7243e221b3a8ed3eec47719a9433a6790343fb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:16Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5be5ca0ce0107e9dec8dbde2f7571bd
1e7fc736c4adbe1b414739e53b2e1787\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.863140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.863175 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.863186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.863201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.863213 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.874835 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.885506 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.897943 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:32Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.908464 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5r28v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b9d04cf-acc2-45e0-8e1c-23c28c061af4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bacd4edc7b44d0019cc3d00d108e612c3aa84c872a9f81eb8bd2df27b8a45bfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fnk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5r28v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.920321 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jn2jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c6e2c9ca-34c4-4d36-9ac4-0e1f6b665737\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://055052f7332ae585f368de2966db345680fd6a210479720afc152185e736fe26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8hwpn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:40Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jn2jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.931445 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ec73284-a45c-4ff7-ac28-3a28c372f1fd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f3d64deaaf28d793ea2b89c18ca8c62f4b4d73bfa24e53b2af5d99e2639f3feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0a774637721fb962768d82125583523b0c046a30496d409dc181074e3de13a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-10T06:51:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.945258 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d759bf72-8239-4223-a21c-6f169e2ea75c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T06:51:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c15706d43c14d6d9434032b41b7b13fd51142b33b064c65deb1bb913d1a4c199\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70df296ec2dd361976452080a1dfeb0db7a9d797b3946c46852b7a1b33c79b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16e906c1862b52bdefe9ab9f00480363eb8421256a856b87b17e17b208942080\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f310dab6f632de5cf27d5aae5bc64420ed7ead5d7ac559ddb07290c9a564220a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-10T06:51:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T06:51:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:52:53Z is after 2025-08-24T17:21:41Z" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.965927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.965970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.966013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.966032 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:53 crc kubenswrapper[4732]: I1010 06:52:53.966045 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:53Z","lastTransitionTime":"2025-10-10T06:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.069297 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.069357 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.069381 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.069411 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.069434 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.171656 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.171707 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.171720 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.171734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.171744 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.274048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.274108 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.274125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.274558 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.274613 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.376773 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.376830 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.376857 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.376880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.376894 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.479233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.479264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.479273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.479287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.479297 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.581870 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.581913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.581925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.581941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.581957 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.660198 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.660227 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.660285 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:54 crc kubenswrapper[4732]: E1010 06:52:54.660461 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:54 crc kubenswrapper[4732]: E1010 06:52:54.660628 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:54 crc kubenswrapper[4732]: E1010 06:52:54.660769 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.685278 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.685304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.685314 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.685329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.685343 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.787439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.787502 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.787525 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.787552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.787577 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.890660 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.890962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.891044 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.891106 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.891192 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.994329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.994871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.995115 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.995335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:54 crc kubenswrapper[4732]: I1010 06:52:54.995497 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:54Z","lastTransitionTime":"2025-10-10T06:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.097425 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.097450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.097459 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.097473 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.097483 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.200298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.200661 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.201007 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.201180 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.201363 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.304283 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.304910 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.305092 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.305299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.305451 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.408346 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.408396 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.408411 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.408429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.408441 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.511902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.512396 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.512554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.512732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.512874 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.615865 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.616081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.616192 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.616286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.616361 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.659886 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:55 crc kubenswrapper[4732]: E1010 06:52:55.660090 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.719361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.720205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.720347 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.720476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.720610 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.824336 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.824405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.824427 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.824454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.824491 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.927045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.927265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.927372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.927460 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:55 crc kubenswrapper[4732]: I1010 06:52:55.927535 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:55Z","lastTransitionTime":"2025-10-10T06:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.030018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.030060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.030068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.030080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.030089 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.133187 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.133265 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.133286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.133314 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.133335 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.238012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.238081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.238106 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.238154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.238179 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.340614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.340660 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.340672 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.340712 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.340724 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.409864 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:56 crc kubenswrapper[4732]: E1010 06:52:56.410042 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:56 crc kubenswrapper[4732]: E1010 06:52:56.410093 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs podName:77abff23-1622-4219-a841-49fe8dbb6cc3 nodeName:}" failed. No retries permitted until 2025-10-10 06:54:00.410076131 +0000 UTC m=+167.479667382 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs") pod "network-metrics-daemon-mj7bk" (UID: "77abff23-1622-4219-a841-49fe8dbb6cc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.442652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.442921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.443071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.443196 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.443345 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.545777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.545815 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.545823 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.545838 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.545848 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.648301 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.648897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.649146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.649405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.649624 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.660015 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.660054 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.660218 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:56 crc kubenswrapper[4732]: E1010 06:52:56.660366 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:56 crc kubenswrapper[4732]: E1010 06:52:56.660393 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:56 crc kubenswrapper[4732]: E1010 06:52:56.660517 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.753155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.753219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.753238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.753262 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.753281 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.855935 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.855976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.855987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.856003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.856015 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.958872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.958921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.958933 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.958949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:56 crc kubenswrapper[4732]: I1010 06:52:56.958962 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:56Z","lastTransitionTime":"2025-10-10T06:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.063927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.063979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.063987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.064000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.064011 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.166804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.166861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.166876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.166897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.166912 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.268791 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.268829 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.268838 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.268850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.268860 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.371149 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.371230 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.371247 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.371272 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.371290 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.474092 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.474185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.474208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.474238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.474259 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.577560 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.577627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.577652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.577682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.577760 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.660203 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:57 crc kubenswrapper[4732]: E1010 06:52:57.660442 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.680872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.680941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.680962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.680985 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.681004 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.783852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.783896 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.783909 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.783927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.783939 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.885982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.886259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.886441 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.886578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.886733 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.989899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.990224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.990358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.990485 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:57 crc kubenswrapper[4732]: I1010 06:52:57.990598 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:57Z","lastTransitionTime":"2025-10-10T06:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.093642 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.093683 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.093713 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.093728 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.093738 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.196422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.196710 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.196784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.196890 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.196949 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.299506 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.299546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.299559 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.299578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.299591 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.402436 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.402477 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.402506 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.402528 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.402550 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.505429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.505476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.505492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.505516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.505533 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.608294 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.608758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.609045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.609202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.609345 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.659832 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.659912 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:52:58 crc kubenswrapper[4732]: E1010 06:52:58.660029 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.659855 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:52:58 crc kubenswrapper[4732]: E1010 06:52:58.660283 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:52:58 crc kubenswrapper[4732]: E1010 06:52:58.660432 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.715051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.715162 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.715193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.715232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.715272 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.820020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.820094 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.820108 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.820132 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.820145 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.923422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.924087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.924142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.924174 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:58 crc kubenswrapper[4732]: I1010 06:52:58.924199 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:58Z","lastTransitionTime":"2025-10-10T06:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.027615 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.027655 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.027665 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.027680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.027720 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.130291 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.130367 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.130392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.130424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.130448 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.233290 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.233351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.233361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.233377 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.233386 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.337076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.337161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.337182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.337221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.337244 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.440626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.440735 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.440760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.440790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.440813 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.544577 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.544668 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.544717 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.544751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.544776 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.647236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.647278 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.647288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.647308 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.647322 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.660035 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:52:59 crc kubenswrapper[4732]: E1010 06:52:59.660227 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.749464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.749507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.749515 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.749529 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.749539 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.852610 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.852667 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.852675 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.852711 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.852722 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.955405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.955469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.955486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.955512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:52:59 crc kubenswrapper[4732]: I1010 06:52:59.955529 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:52:59Z","lastTransitionTime":"2025-10-10T06:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.058168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.058504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.058640 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.058815 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.058938 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.161771 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.161810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.161824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.161839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.161852 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.264250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.264288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.264299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.264315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.264326 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.366368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.366433 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.366448 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.366465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.366476 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.469121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.469164 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.469177 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.469195 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.469213 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.572399 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.572469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.572487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.572513 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.572533 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.659679 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.659770 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:00 crc kubenswrapper[4732]: E1010 06:53:00.659898 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.659956 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:00 crc kubenswrapper[4732]: E1010 06:53:00.660000 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:00 crc kubenswrapper[4732]: E1010 06:53:00.660121 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.661268 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 06:53:00 crc kubenswrapper[4732]: E1010 06:53:00.661511 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.674973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.675013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.675029 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.675049 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.675066 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.777164 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.777221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.777240 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.777261 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.777276 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.880384 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.880454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.880477 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.880506 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.880524 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.983843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.983907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.983922 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.983954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:00 crc kubenswrapper[4732]: I1010 06:53:00.983972 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:00Z","lastTransitionTime":"2025-10-10T06:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.086611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.086747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.086773 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.086806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.086829 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.189257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.189311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.189326 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.189346 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.189372 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.292124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.292168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.292178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.292193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.292204 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.360161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.360201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.360211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.360223 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.360232 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: E1010 06:53:01.372304 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:53:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.376156 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.376322 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.376385 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.376455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.376522 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: E1010 06:53:01.390121 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:53:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.395064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.395323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.395447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.395569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.395673 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: E1010 06:53:01.410724 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:53:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.416418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.416487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.416504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.416525 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.416540 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: E1010 06:53:01.431231 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:53:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.435387 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.435527 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.435601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.435720 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.435814 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: E1010 06:53:01.446864 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-10T06:53:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"677988c9-53ea-44ee-b7e0-55b4b6597681\\\",\\\"systemUUID\\\":\\\"f97cf68a-a91c-438d-bef2-b95519e23c5d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-10T06:53:01Z is after 2025-08-24T17:21:41Z" Oct 10 06:53:01 crc kubenswrapper[4732]: E1010 06:53:01.447220 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.448932 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.448989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.449004 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.449031 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.449043 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.551169 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.551444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.551541 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.551639 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.551742 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.655509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.655852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.656043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.656317 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.656731 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.659913 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:01 crc kubenswrapper[4732]: E1010 06:53:01.660009 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.760365 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.760446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.760459 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.760479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.760491 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.863133 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.863203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.863229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.863257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.863276 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.965968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.966017 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.966026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.966042 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:01 crc kubenswrapper[4732]: I1010 06:53:01.966053 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:01Z","lastTransitionTime":"2025-10-10T06:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.069286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.069356 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.069371 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.069400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.069417 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.172169 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.172229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.172241 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.172264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.172277 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.275682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.275745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.275756 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.275772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.275784 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.378061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.378126 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.378144 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.378171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.378190 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.481677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.481766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.481784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.481812 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.481830 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.584822 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.584876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.584888 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.584908 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.584920 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.659858 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.659968 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.659892 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:02 crc kubenswrapper[4732]: E1010 06:53:02.660115 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:02 crc kubenswrapper[4732]: E1010 06:53:02.660269 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:02 crc kubenswrapper[4732]: E1010 06:53:02.660429 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.688147 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.688207 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.688220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.688241 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.688257 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.791450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.791536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.791551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.791576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.791589 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.900547 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.900601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.900613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.900633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:02 crc kubenswrapper[4732]: I1010 06:53:02.900645 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:02Z","lastTransitionTime":"2025-10-10T06:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.004291 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.004370 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.004395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.004424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.004446 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.106922 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.106972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.106987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.107008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.107023 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.210329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.210389 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.210406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.210430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.210448 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.312808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.312850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.312861 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.312876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.312886 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.416295 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.416370 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.416395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.416426 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.416450 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.518357 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.518397 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.518407 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.518421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.518430 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.621001 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.621056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.621073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.621096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.621113 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.659787 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:03 crc kubenswrapper[4732]: E1010 06:53:03.660014 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.722924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.722991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.723007 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.723034 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.723052 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.731740 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pnlkp" podStartSLOduration=85.731717287 podStartE2EDuration="1m25.731717287s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.719072193 +0000 UTC m=+110.788663444" watchObservedRunningTime="2025-10-10 06:53:03.731717287 +0000 UTC m=+110.801308528" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.766141 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=87.766125584 podStartE2EDuration="1m27.766125584s" podCreationTimestamp="2025-10-10 06:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.765521248 +0000 UTC m=+110.835112499" watchObservedRunningTime="2025-10-10 06:53:03.766125584 +0000 UTC m=+110.835716825" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.817805 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=54.817786535 podStartE2EDuration="54.817786535s" podCreationTimestamp="2025-10-10 06:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.793995028 +0000 UTC m=+110.863586289" watchObservedRunningTime="2025-10-10 06:53:03.817786535 +0000 UTC m=+110.887377766" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.825600 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.825645 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.825653 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.825668 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.825677 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.878973 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2fxmr" podStartSLOduration=85.878953418 podStartE2EDuration="1m25.878953418s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.84603289 +0000 UTC m=+110.915624151" watchObservedRunningTime="2025-10-10 06:53:03.878953418 +0000 UTC m=+110.948544659" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.895554 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.895536955 podStartE2EDuration="1m31.895536955s" podCreationTimestamp="2025-10-10 06:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.895269948 +0000 
UTC m=+110.964861209" watchObservedRunningTime="2025-10-10 06:53:03.895536955 +0000 UTC m=+110.965128196" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.927526 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.927563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.927572 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.927584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.927593 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:03Z","lastTransitionTime":"2025-10-10T06:53:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.936625 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podStartSLOduration=85.936604597 podStartE2EDuration="1m25.936604597s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.931737469 +0000 UTC m=+111.001328740" watchObservedRunningTime="2025-10-10 06:53:03.936604597 +0000 UTC m=+111.006195838" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.963192 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tvj9w" podStartSLOduration=85.963169617 podStartE2EDuration="1m25.963169617s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.949745594 +0000 UTC m=+111.019336845" watchObservedRunningTime="2025-10-10 06:53:03.963169617 +0000 UTC m=+111.032760858" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.976894 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=45.976876779 podStartE2EDuration="45.976876779s" podCreationTimestamp="2025-10-10 06:52:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.964202295 +0000 UTC m=+111.033793536" watchObservedRunningTime="2025-10-10 06:53:03.976876779 +0000 UTC m=+111.046468020" Oct 10 06:53:03 crc kubenswrapper[4732]: I1010 06:53:03.990414 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=90.990391695 podStartE2EDuration="1m30.990391695s" podCreationTimestamp="2025-10-10 06:51:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:03.977517866 +0000 UTC m=+111.047109107" watchObservedRunningTime="2025-10-10 06:53:03.990391695 +0000 UTC m=+111.059982936" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.027454 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5r28v" podStartSLOduration=86.027432091 podStartE2EDuration="1m26.027432091s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:04.013877284 +0000 UTC m=+111.083468525" watchObservedRunningTime="2025-10-10 06:53:04.027432091 +0000 UTC m=+111.097023342" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.027776 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jn2jn" podStartSLOduration=86.02777215 podStartE2EDuration="1m26.02777215s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:04.027298938 +0000 UTC m=+111.096890189" watchObservedRunningTime="2025-10-10 06:53:04.02777215 +0000 UTC m=+111.097363391" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.029682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.029777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 
06:53:04.029794 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.029816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.029831 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.132311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.132363 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.132376 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.132393 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.132405 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.234881 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.234930 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.234942 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.234957 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.234970 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.337805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.337856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.337869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.337887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.337901 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.440950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.441023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.441050 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.441082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.441108 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.544259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.544305 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.544317 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.544334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.544346 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.647747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.647806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.647824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.647847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.647865 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.660108 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.660169 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.660108 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:53:04 crc kubenswrapper[4732]: E1010 06:53:04.660352 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 10 06:53:04 crc kubenswrapper[4732]: E1010 06:53:04.660477 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 10 06:53:04 crc kubenswrapper[4732]: E1010 06:53:04.660668 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.750415 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.750455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.750465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.750481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.750491 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.852797 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.852833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.852841 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.852857 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.852869 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.955207 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.955235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.955244 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.955256 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:04 crc kubenswrapper[4732]: I1010 06:53:04.955264 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:04Z","lastTransitionTime":"2025-10-10T06:53:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.058038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.058101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.058113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.058130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.058144 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.161291 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.161330 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.161343 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.161358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.161372 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.263755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.263817 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.263830 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.263849 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.263862 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.366924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.367496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.367753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.367976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.368169 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.471533 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.472085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.472329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.472542 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.472807 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.576815 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.577221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.577466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.577751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.577973 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.660094 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 10 06:53:05 crc kubenswrapper[4732]: E1010 06:53:05.660779 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.680590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.680659 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.680677 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.680749 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.680785 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.783872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.784307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.784478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.784636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.784818 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.888130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.888180 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.888196 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.888218 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.888236 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.991072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.991518 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.991796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.991918 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:05 crc kubenswrapper[4732]: I1010 06:53:05.992010 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:05Z","lastTransitionTime":"2025-10-10T06:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.095815 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.096713 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.096813 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.096901 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.096988 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.200364 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.200417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.200434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.200457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.200476 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.303340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.303747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.303906 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.304046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.304202 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.407914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.408271 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.408442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.408646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.408806 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.511599 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.511670 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.511729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.511762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.511784 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.615074 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.615127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.615146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.615170 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.615189 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.659685 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 10 06:53:06 crc kubenswrapper[4732]: E1010 06:53:06.660087 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.659849 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk"
Oct 10 06:53:06 crc kubenswrapper[4732]: E1010 06:53:06.660336 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.659781 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 10 06:53:06 crc kubenswrapper[4732]: E1010 06:53:06.660574 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.718856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.718926 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.718943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.718966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.718985 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.822087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.822162 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.822181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.822205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.822224 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.925197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.925261 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.925280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.925306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:06 crc kubenswrapper[4732]: I1010 06:53:06.925324 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:06Z","lastTransitionTime":"2025-10-10T06:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.027739 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.028117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.028836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.028880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.028900 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.131652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.132248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.132341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.132455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.132578 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.235315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.235374 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.235390 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.235413 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.235429 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.338538 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.338583 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.338595 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.338618 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.338632 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.441996 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.442078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.442091 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.442109 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.442122 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.545241 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.545307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.545329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.545360 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.545383 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.648479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.648544 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.648555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.648569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.648596 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.659970 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:07 crc kubenswrapper[4732]: E1010 06:53:07.660335 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.751387 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.751455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.751481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.751511 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.751535 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.853634 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.853679 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.853720 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.853740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.853750 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.956809 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.956843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.956856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.956872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:07 crc kubenswrapper[4732]: I1010 06:53:07.956884 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:07Z","lastTransitionTime":"2025-10-10T06:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.060183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.060257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.060280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.060326 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.060360 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.164008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.164062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.164080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.164107 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.164124 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.267007 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.267062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.267079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.267104 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.267123 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.369347 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.369392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.369441 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.369457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.369481 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.471604 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.471673 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.471729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.471763 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.471785 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.574414 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.574466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.574481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.574502 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.574517 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.659903 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.660026 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:08 crc kubenswrapper[4732]: E1010 06:53:08.660056 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.659908 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:08 crc kubenswrapper[4732]: E1010 06:53:08.660215 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:08 crc kubenswrapper[4732]: E1010 06:53:08.660295 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.677576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.677621 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.677634 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.677650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.677662 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.780783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.780825 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.780836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.780852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.780863 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.884072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.884110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.884120 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.884135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.884147 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.987018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.987055 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.987063 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.987076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:08 crc kubenswrapper[4732]: I1010 06:53:08.987086 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:08Z","lastTransitionTime":"2025-10-10T06:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.089143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.089189 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.089203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.089219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.089230 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.191831 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.191896 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.191916 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.191942 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.191961 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.294003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.294053 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.294064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.294082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.294098 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.396850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.396923 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.396940 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.396967 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.396986 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.499804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.499874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.499898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.499931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.499955 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.602424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.602507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.602533 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.602604 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.602630 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.659495 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:09 crc kubenswrapper[4732]: E1010 06:53:09.659664 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.705615 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.705933 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.706254 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.706375 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.706464 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.810012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.810730 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.811181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.811679 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.812063 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.916219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.916270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.916286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.916309 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:09 crc kubenswrapper[4732]: I1010 06:53:09.916325 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:09Z","lastTransitionTime":"2025-10-10T06:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.019760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.019829 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.019846 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.019874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.019891 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.122146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.122488 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.122598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.122755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.122864 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.226089 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.226130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.226139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.226155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.226164 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.328488 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.328533 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.328545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.328571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.328594 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.430736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.430779 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.430790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.430808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.430819 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.533020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.533057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.533068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.533087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.533103 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.636062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.636103 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.636115 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.636133 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.636146 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.659370 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.659380 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.659459 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:10 crc kubenswrapper[4732]: E1010 06:53:10.659956 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:10 crc kubenswrapper[4732]: E1010 06:53:10.660051 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:10 crc kubenswrapper[4732]: E1010 06:53:10.660162 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.738582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.738631 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.738648 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.738671 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.738754 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.841592 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.841673 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.841733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.841762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.841781 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.944892 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.945201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.945294 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.945381 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:10 crc kubenswrapper[4732]: I1010 06:53:10.945491 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:10Z","lastTransitionTime":"2025-10-10T06:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.054362 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.054421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.054438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.054464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.054481 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:11Z","lastTransitionTime":"2025-10-10T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.156501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.156575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.156600 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.156667 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.156734 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:11Z","lastTransitionTime":"2025-10-10T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.259394 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.260071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.260215 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.260388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.260677 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:11Z","lastTransitionTime":"2025-10-10T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.411443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.411503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.411520 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.411539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.411553 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:11Z","lastTransitionTime":"2025-10-10T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.513900 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.513938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.513950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.513970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.513985 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:11Z","lastTransitionTime":"2025-10-10T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.616445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.616776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.616786 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.616799 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.616809 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:11Z","lastTransitionTime":"2025-10-10T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.659970 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:11 crc kubenswrapper[4732]: E1010 06:53:11.660118 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.663672 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.663723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.663731 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.663744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.663753 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-10T06:53:11Z","lastTransitionTime":"2025-10-10T06:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.726503 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q"] Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.727121 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.730195 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.730407 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.730503 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.739560 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.815473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8200b4e9-540a-431f-aa14-8380052d7a65-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.815529 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8200b4e9-540a-431f-aa14-8380052d7a65-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.815545 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/8200b4e9-540a-431f-aa14-8380052d7a65-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.815562 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8200b4e9-540a-431f-aa14-8380052d7a65-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.815584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8200b4e9-540a-431f-aa14-8380052d7a65-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.916425 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8200b4e9-540a-431f-aa14-8380052d7a65-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.916488 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8200b4e9-540a-431f-aa14-8380052d7a65-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.916505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8200b4e9-540a-431f-aa14-8380052d7a65-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.916520 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8200b4e9-540a-431f-aa14-8380052d7a65-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.916543 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8200b4e9-540a-431f-aa14-8380052d7a65-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.916597 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8200b4e9-540a-431f-aa14-8380052d7a65-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.916982 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/8200b4e9-540a-431f-aa14-8380052d7a65-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.918367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8200b4e9-540a-431f-aa14-8380052d7a65-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.925329 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8200b4e9-540a-431f-aa14-8380052d7a65-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:11 crc kubenswrapper[4732]: I1010 06:53:11.931790 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8200b4e9-540a-431f-aa14-8380052d7a65-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pg68q\" (UID: \"8200b4e9-540a-431f-aa14-8380052d7a65\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.045230 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.245323 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/1.log" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.246144 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/0.log" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.246179 4732 generic.go:334] "Generic (PLEG): container finished" podID="d94cc3c3-3cb6-4a5b-996b-90099415f9bf" containerID="c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209" exitCode=1 Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.246231 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerDied","Data":"c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209"} Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.246264 4732 scope.go:117] "RemoveContainer" containerID="8bc8649decbdeed6c0b4ec2cfbd51ceaa347f2005ca620571b2da2bb5e39d4ef" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.246599 4732 scope.go:117] "RemoveContainer" containerID="c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209" Oct 10 06:53:12 crc kubenswrapper[4732]: E1010 06:53:12.246760 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pnlkp_openshift-multus(d94cc3c3-3cb6-4a5b-996b-90099415f9bf)\"" pod="openshift-multus/multus-pnlkp" podUID="d94cc3c3-3cb6-4a5b-996b-90099415f9bf" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.248767 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" event={"ID":"8200b4e9-540a-431f-aa14-8380052d7a65","Type":"ContainerStarted","Data":"d54da7668517b04f6889d25384f67ab101349fc9d0750546a198a2420bb8bcce"} Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.248859 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" event={"ID":"8200b4e9-540a-431f-aa14-8380052d7a65","Type":"ContainerStarted","Data":"3e9309531a21e97914786a18fcd91850c0205962f623c762d9eed523df49a18a"} Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.284320 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pg68q" podStartSLOduration=94.284301527 podStartE2EDuration="1m34.284301527s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:12.28362681 +0000 UTC m=+119.353218121" watchObservedRunningTime="2025-10-10 06:53:12.284301527 +0000 UTC m=+119.353892778" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.659572 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.659589 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:12 crc kubenswrapper[4732]: E1010 06:53:12.659832 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:12 crc kubenswrapper[4732]: I1010 06:53:12.659608 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:12 crc kubenswrapper[4732]: E1010 06:53:12.659911 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:12 crc kubenswrapper[4732]: E1010 06:53:12.659987 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:13 crc kubenswrapper[4732]: I1010 06:53:13.253894 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/1.log" Oct 10 06:53:13 crc kubenswrapper[4732]: I1010 06:53:13.659603 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:13 crc kubenswrapper[4732]: E1010 06:53:13.660918 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:13 crc kubenswrapper[4732]: I1010 06:53:13.661857 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 06:53:13 crc kubenswrapper[4732]: E1010 06:53:13.662080 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kdb2x_openshift-ovn-kubernetes(f77a19b4-118c-4b7d-9ef2-b7be7fd33e63)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" Oct 10 06:53:13 crc kubenswrapper[4732]: E1010 06:53:13.671233 4732 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 10 06:53:13 crc kubenswrapper[4732]: E1010 06:53:13.788399 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:53:14 crc kubenswrapper[4732]: I1010 06:53:14.659933 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:14 crc kubenswrapper[4732]: I1010 06:53:14.660005 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:14 crc kubenswrapper[4732]: E1010 06:53:14.660089 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:14 crc kubenswrapper[4732]: I1010 06:53:14.660185 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:14 crc kubenswrapper[4732]: E1010 06:53:14.660337 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:14 crc kubenswrapper[4732]: E1010 06:53:14.660436 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:15 crc kubenswrapper[4732]: I1010 06:53:15.660084 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:15 crc kubenswrapper[4732]: E1010 06:53:15.660254 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:16 crc kubenswrapper[4732]: I1010 06:53:16.659535 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:16 crc kubenswrapper[4732]: I1010 06:53:16.659559 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:16 crc kubenswrapper[4732]: I1010 06:53:16.659560 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:16 crc kubenswrapper[4732]: E1010 06:53:16.659669 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:16 crc kubenswrapper[4732]: E1010 06:53:16.659845 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:16 crc kubenswrapper[4732]: E1010 06:53:16.659931 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:17 crc kubenswrapper[4732]: I1010 06:53:17.660261 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:17 crc kubenswrapper[4732]: E1010 06:53:17.660453 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:18 crc kubenswrapper[4732]: I1010 06:53:18.659580 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:18 crc kubenswrapper[4732]: E1010 06:53:18.659747 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:18 crc kubenswrapper[4732]: I1010 06:53:18.659599 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:18 crc kubenswrapper[4732]: E1010 06:53:18.659861 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:18 crc kubenswrapper[4732]: I1010 06:53:18.659580 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:18 crc kubenswrapper[4732]: E1010 06:53:18.659911 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:18 crc kubenswrapper[4732]: E1010 06:53:18.789768 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:53:19 crc kubenswrapper[4732]: I1010 06:53:19.659853 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:19 crc kubenswrapper[4732]: E1010 06:53:19.660345 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:20 crc kubenswrapper[4732]: I1010 06:53:20.659577 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:20 crc kubenswrapper[4732]: I1010 06:53:20.659623 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:20 crc kubenswrapper[4732]: E1010 06:53:20.659791 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:20 crc kubenswrapper[4732]: I1010 06:53:20.659591 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:20 crc kubenswrapper[4732]: E1010 06:53:20.659957 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:20 crc kubenswrapper[4732]: E1010 06:53:20.660234 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:21 crc kubenswrapper[4732]: I1010 06:53:21.659885 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:21 crc kubenswrapper[4732]: E1010 06:53:21.660094 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:22 crc kubenswrapper[4732]: I1010 06:53:22.659469 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:22 crc kubenswrapper[4732]: I1010 06:53:22.659479 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:22 crc kubenswrapper[4732]: I1010 06:53:22.659863 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:22 crc kubenswrapper[4732]: E1010 06:53:22.660044 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:22 crc kubenswrapper[4732]: E1010 06:53:22.660209 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:22 crc kubenswrapper[4732]: E1010 06:53:22.660438 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:23 crc kubenswrapper[4732]: I1010 06:53:23.659907 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:23 crc kubenswrapper[4732]: E1010 06:53:23.661174 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:23 crc kubenswrapper[4732]: I1010 06:53:23.661630 4732 scope.go:117] "RemoveContainer" containerID="c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209" Oct 10 06:53:23 crc kubenswrapper[4732]: E1010 06:53:23.790400 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:53:24 crc kubenswrapper[4732]: I1010 06:53:24.291555 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/1.log" Oct 10 06:53:24 crc kubenswrapper[4732]: I1010 06:53:24.291665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerStarted","Data":"a37376247646e5fbbfa00dc811de67eb8ada41b85d70795bc6c90d092c37d808"} Oct 10 06:53:24 crc kubenswrapper[4732]: I1010 06:53:24.660168 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:24 crc kubenswrapper[4732]: E1010 06:53:24.660882 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:24 crc kubenswrapper[4732]: I1010 06:53:24.660238 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:24 crc kubenswrapper[4732]: I1010 06:53:24.660232 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:24 crc kubenswrapper[4732]: E1010 06:53:24.661139 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:24 crc kubenswrapper[4732]: E1010 06:53:24.661239 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:25 crc kubenswrapper[4732]: I1010 06:53:25.660016 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:25 crc kubenswrapper[4732]: E1010 06:53:25.660407 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:26 crc kubenswrapper[4732]: I1010 06:53:26.659348 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:26 crc kubenswrapper[4732]: I1010 06:53:26.659388 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:26 crc kubenswrapper[4732]: I1010 06:53:26.659348 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:26 crc kubenswrapper[4732]: E1010 06:53:26.659669 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:26 crc kubenswrapper[4732]: E1010 06:53:26.659794 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:26 crc kubenswrapper[4732]: E1010 06:53:26.659835 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:26 crc kubenswrapper[4732]: I1010 06:53:26.660375 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 06:53:27 crc kubenswrapper[4732]: I1010 06:53:27.303262 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/3.log" Oct 10 06:53:27 crc kubenswrapper[4732]: I1010 06:53:27.306060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerStarted","Data":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} Oct 10 06:53:27 crc kubenswrapper[4732]: I1010 06:53:27.306461 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:53:27 crc 
kubenswrapper[4732]: I1010 06:53:27.561211 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podStartSLOduration=109.561191696 podStartE2EDuration="1m49.561191696s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:27.342671297 +0000 UTC m=+134.412262558" watchObservedRunningTime="2025-10-10 06:53:27.561191696 +0000 UTC m=+134.630782927" Oct 10 06:53:27 crc kubenswrapper[4732]: I1010 06:53:27.562026 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mj7bk"] Oct 10 06:53:27 crc kubenswrapper[4732]: I1010 06:53:27.562101 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:27 crc kubenswrapper[4732]: E1010 06:53:27.562174 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:27 crc kubenswrapper[4732]: I1010 06:53:27.659406 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:27 crc kubenswrapper[4732]: E1010 06:53:27.659558 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:28 crc kubenswrapper[4732]: I1010 06:53:28.659219 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:28 crc kubenswrapper[4732]: I1010 06:53:28.659272 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:28 crc kubenswrapper[4732]: E1010 06:53:28.659358 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:28 crc kubenswrapper[4732]: E1010 06:53:28.659491 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:28 crc kubenswrapper[4732]: E1010 06:53:28.810191 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 06:53:29 crc kubenswrapper[4732]: I1010 06:53:29.659248 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:29 crc kubenswrapper[4732]: E1010 06:53:29.659436 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:29 crc kubenswrapper[4732]: I1010 06:53:29.659677 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:29 crc kubenswrapper[4732]: E1010 06:53:29.659837 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:30 crc kubenswrapper[4732]: I1010 06:53:30.659649 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:30 crc kubenswrapper[4732]: I1010 06:53:30.659656 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:30 crc kubenswrapper[4732]: E1010 06:53:30.660005 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:30 crc kubenswrapper[4732]: E1010 06:53:30.660106 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:31 crc kubenswrapper[4732]: I1010 06:53:31.659389 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:31 crc kubenswrapper[4732]: E1010 06:53:31.659556 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:31 crc kubenswrapper[4732]: I1010 06:53:31.659832 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:31 crc kubenswrapper[4732]: E1010 06:53:31.660016 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:32 crc kubenswrapper[4732]: I1010 06:53:32.659673 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:32 crc kubenswrapper[4732]: I1010 06:53:32.659760 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:32 crc kubenswrapper[4732]: E1010 06:53:32.659869 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 10 06:53:32 crc kubenswrapper[4732]: E1010 06:53:32.659954 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 10 06:53:33 crc kubenswrapper[4732]: I1010 06:53:33.659143 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:33 crc kubenswrapper[4732]: I1010 06:53:33.659227 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:33 crc kubenswrapper[4732]: E1010 06:53:33.661289 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 10 06:53:33 crc kubenswrapper[4732]: E1010 06:53:33.661408 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mj7bk" podUID="77abff23-1622-4219-a841-49fe8dbb6cc3" Oct 10 06:53:34 crc kubenswrapper[4732]: I1010 06:53:34.659939 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:34 crc kubenswrapper[4732]: I1010 06:53:34.660460 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:34 crc kubenswrapper[4732]: I1010 06:53:34.663214 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 10 06:53:34 crc kubenswrapper[4732]: I1010 06:53:34.663845 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 10 06:53:34 crc kubenswrapper[4732]: I1010 06:53:34.664242 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 10 06:53:34 crc kubenswrapper[4732]: I1010 06:53:34.663932 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 10 06:53:35 crc kubenswrapper[4732]: I1010 06:53:35.659562 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:53:35 crc kubenswrapper[4732]: I1010 06:53:35.659587 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:35 crc kubenswrapper[4732]: I1010 06:53:35.663939 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 10 06:53:35 crc kubenswrapper[4732]: I1010 06:53:35.666567 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.518257 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:40 crc kubenswrapper[4732]: E1010 06:53:40.518432 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:55:42.518376159 +0000 UTC m=+269.587967440 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.519058 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.519182 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.520647 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.528372 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.620750 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.620839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.628606 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.628642 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:40 crc 
kubenswrapper[4732]: I1010 06:53:40.688490 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.698729 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 10 06:53:40 crc kubenswrapper[4732]: I1010 06:53:40.799425 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:40 crc kubenswrapper[4732]: W1010 06:53:40.941980 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-52c650adea32da534ca094d2face66f7140bb897b5ff6fbfa75ac3f1f9395dc5 WatchSource:0}: Error finding container 52c650adea32da534ca094d2face66f7140bb897b5ff6fbfa75ac3f1f9395dc5: Status 404 returned error can't find the container with id 52c650adea32da534ca094d2face66f7140bb897b5ff6fbfa75ac3f1f9395dc5 Oct 10 06:53:41 crc kubenswrapper[4732]: W1010 06:53:41.026328 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-6589631aaa4d2d5fe2c4961468d0d4ed73caed845349ce6a3dd809a65ada5261 WatchSource:0}: Error finding container 6589631aaa4d2d5fe2c4961468d0d4ed73caed845349ce6a3dd809a65ada5261: Status 404 returned error can't find the container with id 6589631aaa4d2d5fe2c4961468d0d4ed73caed845349ce6a3dd809a65ada5261 Oct 10 06:53:41 crc kubenswrapper[4732]: W1010 06:53:41.131919 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ac329c23c636fc5b0b7220084f41d862d9bec941d613499233225670814cc0fc 
WatchSource:0}: Error finding container ac329c23c636fc5b0b7220084f41d862d9bec941d613499233225670814cc0fc: Status 404 returned error can't find the container with id ac329c23c636fc5b0b7220084f41d862d9bec941d613499233225670814cc0fc Oct 10 06:53:41 crc kubenswrapper[4732]: I1010 06:53:41.357467 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a65f67acb1be0f4675ec96d5f5370d36c7ef2cf913eca42464961f99de21334d"} Oct 10 06:53:41 crc kubenswrapper[4732]: I1010 06:53:41.357568 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ac329c23c636fc5b0b7220084f41d862d9bec941d613499233225670814cc0fc"} Oct 10 06:53:41 crc kubenswrapper[4732]: I1010 06:53:41.360658 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"00c836169eb80d8071d8cbbd8e73a8a5cbb9f03a71d9189e4d594ad91d5dc57b"} Oct 10 06:53:41 crc kubenswrapper[4732]: I1010 06:53:41.360754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6589631aaa4d2d5fe2c4961468d0d4ed73caed845349ce6a3dd809a65ada5261"} Oct 10 06:53:41 crc kubenswrapper[4732]: I1010 06:53:41.361105 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:53:41 crc kubenswrapper[4732]: I1010 06:53:41.362578 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5fe7d4a384be4ad7d0ebaefade242eaacb25190764235fb14b5acfc925efb777"} Oct 10 06:53:41 crc kubenswrapper[4732]: I1010 06:53:41.362610 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"52c650adea32da534ca094d2face66f7140bb897b5ff6fbfa75ac3f1f9395dc5"} Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.366978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.462661 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2lfbx"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.463431 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.466826 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.466881 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.467321 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.469745 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6srp"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.470139 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.473213 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.482713 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.483405 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.483530 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.484551 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.484670 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.485141 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.485430 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.485534 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.485613 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.485730 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.485792 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.486330 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.486434 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.487678 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.488043 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8qd8h"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.488872 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.488158 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.489204 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fdshh"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.489435 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.489847 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z2jfv"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.490175 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.490214 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.490296 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.490596 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.490638 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.491026 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.491096 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.492488 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.493208 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.493334 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-kg7gq"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.493740 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.500542 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.500959 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.512906 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.513078 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.513281 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.513294 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-f7zpr"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.513355 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.513379 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.513750 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-f7zpr" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.514861 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.515000 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.515007 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.515167 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.515263 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.515506 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.516305 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.519597 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.519910 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.519934 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.520244 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.520296 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.520248 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.520436 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.525129 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.525298 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" 
Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.525787 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.525917 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.526132 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.526213 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.526244 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.526301 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.526396 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.526707 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.526958 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527131 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527276 
4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527362 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527438 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527505 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527745 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527750 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527863 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.527957 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.528046 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.529273 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bbf6g"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.529792 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8tq5m"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.530166 
4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.530652 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.531114 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.531375 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.531587 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.531773 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.531983 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.532015 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.532221 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.532379 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.535168 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.535459 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.536117 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.536411 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.536822 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.536940 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.536978 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.536984 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.537157 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.537448 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.537575 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xghqq"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.538073 4732 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.538359 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.549656 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.550636 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.552101 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.553235 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.553763 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.553680 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.554252 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.558121 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.558726 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 10 06:53:42 crc 
kubenswrapper[4732]: I1010 06:53:42.560899 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579558 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-config\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579605 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4thm\" (UniqueName: \"kubernetes.io/projected/e8b58414-93da-4fc9-904b-1886401e00c8-kube-api-access-z4thm\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579628 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1f6a9a4-0043-442a-9f1a-6661546d2397-audit-dir\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579644 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-etcd-serving-ca\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579662 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-audit\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579675 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579705 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-encryption-config\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579720 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c1f6a9a4-0043-442a-9f1a-6661546d2397-node-pullsecrets\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579737 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-config\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: 
I1010 06:53:42.579753 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjv9w\" (UniqueName: \"kubernetes.io/projected/c1f6a9a4-0043-442a-9f1a-6661546d2397-kube-api-access-mjv9w\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579811 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-serving-cert\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-image-import-ca\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579871 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b58414-93da-4fc9-904b-1886401e00c8-serving-cert\") pod \"controller-manager-879f6c89f-z6srp\" (UID: 
\"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579888 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-client-ca\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-etcd-client\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.579910 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.580003 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.580368 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.580916 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.581048 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.581416 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.581616 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.581773 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.581900 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.582025 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.583512 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.584404 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.585203 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.585716 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.588426 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.593661 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.594127 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.594511 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.595354 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tqc8v"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.595969 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.597261 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.597566 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wkrlj"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.597952 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.598833 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.599610 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.602448 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.603247 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.604390 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.604797 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6srp"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.605164 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.606011 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.607130 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2lfbx"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.617381 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.617758 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.617794 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.619074 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.628914 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.633393 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f7phv"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.635516 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.635990 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.636159 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.636455 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.637004 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.649038 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.651410 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.652454 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.653506 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g2xxs"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.653913 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.655185 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.662600 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.664544 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bbf6g"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.664759 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.672007 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.680937 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681401 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c1f6a9a4-0043-442a-9f1a-6661546d2397-node-pullsecrets\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681433 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc 
kubenswrapper[4732]: I1010 06:53:42.681454 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99b7a779-8943-4774-b15e-959fa326d08d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v6gpp\" (UID: \"99b7a779-8943-4774-b15e-959fa326d08d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681474 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681493 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-config\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681510 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjv9w\" (UniqueName: \"kubernetes.io/projected/c1f6a9a4-0043-442a-9f1a-6661546d2397-kube-api-access-mjv9w\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681524 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nt9q\" (UniqueName: \"kubernetes.io/projected/99b7a779-8943-4774-b15e-959fa326d08d-kube-api-access-4nt9q\") pod 
\"cluster-samples-operator-665b6dd947-v6gpp\" (UID: \"99b7a779-8943-4774-b15e-959fa326d08d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681539 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c511d3-25e3-422a-b0b2-099b14de9a01-serving-cert\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681556 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681569 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvgsj\" (UniqueName: \"kubernetes.io/projected/524083e6-c56c-4c74-b700-ac668cb2022c-kube-api-access-wvgsj\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 
06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681613 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681629 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e143458e-3d6b-4b61-8811-c462db11f97f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681647 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681651 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681661 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-serving-cert\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681677 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-image-import-ca\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681752 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b58414-93da-4fc9-904b-1886401e00c8-serving-cert\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-audit-policies\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681784 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681802 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-client-ca\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-etcd-client\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681834 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76679b84-27e7-4a6a-b904-f399c9b7eb8d-serving-cert\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681852 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 
crc kubenswrapper[4732]: I1010 06:53:42.681866 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-config\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681883 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xr4\" (UniqueName: \"kubernetes.io/projected/76679b84-27e7-4a6a-b904-f399c9b7eb8d-kube-api-access-h9xr4\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681900 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-config\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681918 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681936 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8rm5\" (UniqueName: \"kubernetes.io/projected/46c511d3-25e3-422a-b0b2-099b14de9a01-kube-api-access-h8rm5\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681971 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.681987 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4thm\" (UniqueName: \"kubernetes.io/projected/e8b58414-93da-4fc9-904b-1886401e00c8-kube-api-access-z4thm\") pod \"controller-manager-879f6c89f-z6srp\" (UID: 
\"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682019 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682036 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf494\" (UniqueName: \"kubernetes.io/projected/e143458e-3d6b-4b61-8811-c462db11f97f-kube-api-access-tf494\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682054 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1f6a9a4-0043-442a-9f1a-6661546d2397-audit-dir\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682057 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682071 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-etcd-serving-ca\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbnq\" (UniqueName: \"kubernetes.io/projected/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-kube-api-access-dvbnq\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-audit\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682119 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/524083e6-c56c-4c74-b700-ac668cb2022c-audit-dir\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682151 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682167 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e143458e-3d6b-4b61-8811-c462db11f97f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682185 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-config\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-client-ca\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.682164 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c1f6a9a4-0043-442a-9f1a-6661546d2397-node-pullsecrets\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc 
kubenswrapper[4732]: I1010 06:53:42.685756 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.685778 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.685922 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.685951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-service-ca-bundle\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.686000 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-encryption-config\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc 
kubenswrapper[4732]: I1010 06:53:42.687085 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-config\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.688109 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-config\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.688188 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1f6a9a4-0043-442a-9f1a-6661546d2397-audit-dir\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.688576 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-etcd-serving-ca\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.689603 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.689998 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-audit\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.690632 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.690654 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-client-ca\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.690903 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.691592 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c1f6a9a4-0043-442a-9f1a-6661546d2397-image-import-ca\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.694324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-encryption-config\") pod \"apiserver-76f77b778f-2lfbx\" (UID: 
\"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.697256 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.697292 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b58414-93da-4fc9-904b-1886401e00c8-serving-cert\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.697848 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.698173 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-serving-cert\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.698384 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.698386 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.698856 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.699212 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c1f6a9a4-0043-442a-9f1a-6661546d2397-etcd-client\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.699307 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.702301 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qtwpt"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.702844 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.704307 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.705185 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.705709 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.705979 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.712730 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.712931 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.713611 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.715624 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-78wpk"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.716108 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.718076 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.721924 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kg7gq"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.725893 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.727492 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fdshh"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.728811 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.730187 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z2jfv"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.731335 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.732602 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f7zpr"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.734472 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.735577 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xghqq"] Oct 10 06:53:42 crc 
kubenswrapper[4732]: I1010 06:53:42.737090 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g2xxs"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.738295 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.739400 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.740605 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.741799 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.742923 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tqc8v"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.743991 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.745177 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.745226 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.746405 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8tq5m"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.747500 4732 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.748487 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.749531 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8qd8h"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.750741 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ssm2l"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.751354 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.751562 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.759306 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.761033 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.762455 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.763821 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-78wpk"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.765370 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.765908 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.767321 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.769117 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f7phv"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.771769 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ssm2l"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.773313 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.775201 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gsx4x"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.776864 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vl4vz"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.777040 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.777264 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.778419 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vl4vz"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.779722 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gsx4x"] Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.785980 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.786602 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.786654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.786804 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.786840 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf494\" (UniqueName: \"kubernetes.io/projected/e143458e-3d6b-4b61-8811-c462db11f97f-kube-api-access-tf494\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/524083e6-c56c-4c74-b700-ac668cb2022c-audit-dir\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787389 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787414 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbnq\" (UniqueName: \"kubernetes.io/projected/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-kube-api-access-dvbnq\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787440 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e143458e-3d6b-4b61-8811-c462db11f97f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: 
\"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787451 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/524083e6-c56c-4c74-b700-ac668cb2022c-audit-dir\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787509 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787540 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-config\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787562 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-client-ca\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787611 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-service-ca-bundle\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787664 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99b7a779-8943-4774-b15e-959fa326d08d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v6gpp\" (UID: \"99b7a779-8943-4774-b15e-959fa326d08d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.787954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788309 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nt9q\" (UniqueName: \"kubernetes.io/projected/99b7a779-8943-4774-b15e-959fa326d08d-kube-api-access-4nt9q\") pod \"cluster-samples-operator-665b6dd947-v6gpp\" (UID: \"99b7a779-8943-4774-b15e-959fa326d08d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788342 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788372 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788411 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvgsj\" (UniqueName: \"kubernetes.io/projected/524083e6-c56c-4c74-b700-ac668cb2022c-kube-api-access-wvgsj\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788438 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/46c511d3-25e3-422a-b0b2-099b14de9a01-serving-cert\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e143458e-3d6b-4b61-8811-c462db11f97f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-audit-policies\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76679b84-27e7-4a6a-b904-f399c9b7eb8d-serving-cert\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 
crc kubenswrapper[4732]: I1010 06:53:42.788580 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-service-ca-bundle\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788614 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e143458e-3d6b-4b61-8811-c462db11f97f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788711 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788735 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-config\") pod 
\"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788760 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xr4\" (UniqueName: \"kubernetes.io/projected/76679b84-27e7-4a6a-b904-f399c9b7eb8d-kube-api-access-h9xr4\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788792 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788819 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788845 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8rm5\" (UniqueName: \"kubernetes.io/projected/46c511d3-25e3-422a-b0b2-099b14de9a01-kube-api-access-h8rm5\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788855 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-client-ca\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.788885 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.789103 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-config\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.789122 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.789310 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.789781 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-audit-policies\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.789801 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46c511d3-25e3-422a-b0b2-099b14de9a01-config\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.790073 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.790113 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.791395 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e143458e-3d6b-4b61-8811-c462db11f97f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.791958 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99b7a779-8943-4774-b15e-959fa326d08d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v6gpp\" (UID: \"99b7a779-8943-4774-b15e-959fa326d08d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.791999 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.792058 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46c511d3-25e3-422a-b0b2-099b14de9a01-serving-cert\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.792323 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76679b84-27e7-4a6a-b904-f399c9b7eb8d-serving-cert\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.792732 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.793308 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.794022 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.794051 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.794400 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.797857 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.798462 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.798525 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.806160 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.825068 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.844903 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.865831 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.885309 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.905986 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.925168 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.965747 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 10 06:53:42 crc kubenswrapper[4732]: I1010 06:53:42.986038 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.005355 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.025417 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.045382 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 10 06:53:43 crc 
kubenswrapper[4732]: I1010 06:53:43.066252 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.086083 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.106161 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.125900 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.145365 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.165440 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.186266 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.205279 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.225424 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.245864 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.265883 4732 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.286123 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.305830 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.324873 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.352830 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.365051 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.385514 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.406848 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.426171 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.446312 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.469402 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 10 06:53:43 crc 
kubenswrapper[4732]: I1010 06:53:43.507716 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.509068 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.525819 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.545987 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.566115 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.585962 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.606573 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.625926 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.646356 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.663972 4732 request.go:700] Waited for 1.008329328s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.666621 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.685586 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.713302 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.726191 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.745751 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.766292 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.785703 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.805342 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.825319 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 10 
06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.845719 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.865411 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.886215 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.923785 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4thm\" (UniqueName: \"kubernetes.io/projected/e8b58414-93da-4fc9-904b-1886401e00c8-kube-api-access-z4thm\") pod \"controller-manager-879f6c89f-z6srp\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.941040 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjv9w\" (UniqueName: \"kubernetes.io/projected/c1f6a9a4-0043-442a-9f1a-6661546d2397-kube-api-access-mjv9w\") pod \"apiserver-76f77b778f-2lfbx\" (UID: \"c1f6a9a4-0043-442a-9f1a-6661546d2397\") " pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.946778 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.966678 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.986297 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 
10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.988441 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:43 crc kubenswrapper[4732]: I1010 06:53:43.997558 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.006340 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.026396 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.048158 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.074187 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.086301 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.105622 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.125621 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.146274 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 10 
06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.166861 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2lfbx"] Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.167149 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: W1010 06:53:44.177986 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f6a9a4_0043_442a_9f1a_6661546d2397.slice/crio-4f9511efec76d8c5fb732bc845a83d9623ac64fb36e678c6128058886549514a WatchSource:0}: Error finding container 4f9511efec76d8c5fb732bc845a83d9623ac64fb36e678c6128058886549514a: Status 404 returned error can't find the container with id 4f9511efec76d8c5fb732bc845a83d9623ac64fb36e678c6128058886549514a Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.187340 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.196058 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6srp"] Oct 10 06:53:44 crc kubenswrapper[4732]: W1010 06:53:44.203056 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8b58414_93da_4fc9_904b_1886401e00c8.slice/crio-d68157c85424df9af606f9d206aecc46b259202642c4ac1adb5def9fb98fa6bd WatchSource:0}: Error finding container d68157c85424df9af606f9d206aecc46b259202642c4ac1adb5def9fb98fa6bd: Status 404 returned error can't find the container with id d68157c85424df9af606f9d206aecc46b259202642c4ac1adb5def9fb98fa6bd Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.205420 4732 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.225167 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.248001 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.266379 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.285942 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.305400 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.326917 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.346001 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.374447 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" event={"ID":"e8b58414-93da-4fc9-904b-1886401e00c8","Type":"ContainerStarted","Data":"06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391"} Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.374487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" 
event={"ID":"e8b58414-93da-4fc9-904b-1886401e00c8","Type":"ContainerStarted","Data":"d68157c85424df9af606f9d206aecc46b259202642c4ac1adb5def9fb98fa6bd"} Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.375213 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.381638 4732 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z6srp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.381678 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" podUID="e8b58414-93da-4fc9-904b-1886401e00c8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.382443 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" event={"ID":"c1f6a9a4-0043-442a-9f1a-6661546d2397","Type":"ContainerStarted","Data":"4f9511efec76d8c5fb732bc845a83d9623ac64fb36e678c6128058886549514a"} Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.385917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.405916 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.425286 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.446326 
4732 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.465444 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.486272 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.506042 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.525946 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.545565 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.566266 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.601361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf494\" (UniqueName: \"kubernetes.io/projected/e143458e-3d6b-4b61-8811-c462db11f97f-kube-api-access-tf494\") pod \"openshift-apiserver-operator-796bbdcf4f-4cfw4\" (UID: \"e143458e-3d6b-4b61-8811-c462db11f97f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.609425 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.633543 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbnq\" (UniqueName: \"kubernetes.io/projected/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-kube-api-access-dvbnq\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.639234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5e93139-328d-4d3c-bc5d-7c25d67f51d2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vvpcw\" (UID: \"d5e93139-328d-4d3c-bc5d-7c25d67f51d2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.659944 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvgsj\" (UniqueName: \"kubernetes.io/projected/524083e6-c56c-4c74-b700-ac668cb2022c-kube-api-access-wvgsj\") pod \"oauth-openshift-558db77b4-8qd8h\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.664228 4732 request.go:700] Waited for 1.875428342s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.697463 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nt9q\" (UniqueName: \"kubernetes.io/projected/99b7a779-8943-4774-b15e-959fa326d08d-kube-api-access-4nt9q\") pod 
\"cluster-samples-operator-665b6dd947-v6gpp\" (UID: \"99b7a779-8943-4774-b15e-959fa326d08d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.702903 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.705334 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8rm5\" (UniqueName: \"kubernetes.io/projected/46c511d3-25e3-422a-b0b2-099b14de9a01-kube-api-access-h8rm5\") pod \"authentication-operator-69f744f599-fdshh\" (UID: \"46c511d3-25e3-422a-b0b2-099b14de9a01\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.723532 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xr4\" (UniqueName: \"kubernetes.io/projected/76679b84-27e7-4a6a-b904-f399c9b7eb8d-kube-api-access-h9xr4\") pod \"route-controller-manager-6576b87f9c-l858p\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.748499 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.780619 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4"] Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.814938 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-serving-cert\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821276 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05b8f5ea-d132-4c18-ac52-00fbac36d987-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821296 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-trusted-ca\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821332 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821355 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-d97c6\" (UniqueName: \"kubernetes.io/projected/e5164ff6-32ed-4b15-a74a-bd7783dbceea-kube-api-access-d97c6\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821379 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-oauth-serving-cert\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821431 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-tls\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821456 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca006e78-0280-497a-9e0b-1c52edc29e45-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.821618 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.822516 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.822563 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-bound-sa-token\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.822674 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e64e809-d579-480a-bfed-24473604cff0-images\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.822780 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b8f5ea-d132-4c18-ac52-00fbac36d987-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.823557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-service-ca\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.823577 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-trusted-ca-bundle\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.823641 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dec63d28-86dc-4410-87a5-b7837f0d7070-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wv8n4\" (UID: \"dec63d28-86dc-4410-87a5-b7837f0d7070\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.823661 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5164ff6-32ed-4b15-a74a-bd7783dbceea-trusted-ca\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.823756 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-client\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.824026 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-serving-cert\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.824046 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5bw\" (UniqueName: \"kubernetes.io/projected/45674fdf-b85c-4d66-afc3-b0fad73523da-kube-api-access-xx5bw\") pod \"downloads-7954f5f757-f7zpr\" (UID: \"45674fdf-b85c-4d66-afc3-b0fad73523da\") " pod="openshift-console/downloads-7954f5f757-f7zpr" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.824085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5164ff6-32ed-4b15-a74a-bd7783dbceea-serving-cert\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.824180 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-certificates\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.824223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-audit-policies\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825230 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca006e78-0280-497a-9e0b-1c52edc29e45-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825253 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trp2k\" (UniqueName: \"kubernetes.io/projected/dec63d28-86dc-4410-87a5-b7837f0d7070-kube-api-access-trp2k\") pod \"control-plane-machine-set-operator-78cbb6b69f-wv8n4\" (UID: \"dec63d28-86dc-4410-87a5-b7837f0d7070\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825275 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825290 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/209fb0dc-d6b6-476d-b160-6a0052080df5-auth-proxy-config\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825314 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-oauth-config\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825350 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-serving-cert\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5164ff6-32ed-4b15-a74a-bd7783dbceea-config\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825382 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825397 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-259f4\" (UniqueName: \"kubernetes.io/projected/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-kube-api-access-259f4\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825415 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nznf4\" (UniqueName: \"kubernetes.io/projected/3c038cc2-3a8f-43e9-afc6-8b22acca9266-kube-api-access-nznf4\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825457 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzqnq\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-kube-api-access-lzqnq\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825506 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g75tf\" (UniqueName: \"kubernetes.io/projected/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-kube-api-access-g75tf\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.825642 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-service-ca\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826205 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c038cc2-3a8f-43e9-afc6-8b22acca9266-audit-dir\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: E1010 06:53:44.826241 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.326207945 +0000 UTC m=+152.395799176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826299 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e64e809-d579-480a-bfed-24473604cff0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826357 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-console-config\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca006e78-0280-497a-9e0b-1c52edc29e45-config\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826439 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-ca\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826463 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/209fb0dc-d6b6-476d-b160-6a0052080df5-machine-approver-tls\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/209fb0dc-d6b6-476d-b160-6a0052080df5-config\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826552 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87szw\" (UniqueName: \"kubernetes.io/projected/3e64e809-d579-480a-bfed-24473604cff0-kube-api-access-87szw\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826586 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-serving-cert\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc 
kubenswrapper[4732]: I1010 06:53:44.826609 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggzcr\" (UniqueName: \"kubernetes.io/projected/209fb0dc-d6b6-476d-b160-6a0052080df5-kube-api-access-ggzcr\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826630 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj29j\" (UniqueName: \"kubernetes.io/projected/e7a62711-6cb6-4867-a232-8b8b043faa74-kube-api-access-bj29j\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-config\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826719 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b8f5ea-d132-4c18-ac52-00fbac36d987-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826748 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-etcd-client\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826772 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-encryption-config\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.826826 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64e809-d579-480a-bfed-24473604cff0-config\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.890758 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8qd8h"] Oct 10 06:53:44 crc kubenswrapper[4732]: W1010 06:53:44.898805 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524083e6_c56c_4c74_b700_ac668cb2022c.slice/crio-bf80d81a548701aee45554b4063f73093161747322aa85ae4c118c42457d69e0 WatchSource:0}: Error finding container bf80d81a548701aee45554b4063f73093161747322aa85ae4c118c42457d69e0: Status 404 returned error can't find the container with id bf80d81a548701aee45554b4063f73093161747322aa85ae4c118c42457d69e0 Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928152 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928396 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5164ff6-32ed-4b15-a74a-bd7783dbceea-trusted-ca\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-certificates\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928477 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-client\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928499 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5bw\" (UniqueName: \"kubernetes.io/projected/45674fdf-b85c-4d66-afc3-b0fad73523da-kube-api-access-xx5bw\") pod \"downloads-7954f5f757-f7zpr\" (UID: \"45674fdf-b85c-4d66-afc3-b0fad73523da\") " pod="openshift-console/downloads-7954f5f757-f7zpr" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e5164ff6-32ed-4b15-a74a-bd7783dbceea-serving-cert\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928548 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-serving-cert\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928571 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-audit-policies\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928596 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vtr\" (UniqueName: \"kubernetes.io/projected/466cd789-08e1-413b-a590-8c3e3fb3cb40-kube-api-access-t2vtr\") pod \"package-server-manager-789f6589d5-v7gcd\" (UID: \"466cd789-08e1-413b-a590-8c3e3fb3cb40\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928616 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5ae2db-8e0c-4f4e-bef0-9798a6c09683-cert\") pod \"ingress-canary-vl4vz\" (UID: \"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683\") " pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 
06:53:44.928650 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pph\" (UniqueName: \"kubernetes.io/projected/471e7f45-877a-4cba-8e27-b2a249dac74e-kube-api-access-45pph\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928675 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll797\" (UniqueName: \"kubernetes.io/projected/e9043fa5-3781-44a6-ab0d-1e5f98755081-kube-api-access-ll797\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928729 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-srv-cert\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928768 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trp2k\" (UniqueName: \"kubernetes.io/projected/dec63d28-86dc-4410-87a5-b7837f0d7070-kube-api-access-trp2k\") pod \"control-plane-machine-set-operator-78cbb6b69f-wv8n4\" (UID: \"dec63d28-86dc-4410-87a5-b7837f0d7070\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928793 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/209fb0dc-d6b6-476d-b160-6a0052080df5-auth-proxy-config\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928815 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/52ff08f1-7920-48d3-ae45-54b18ea49dd2-profile-collector-cert\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928841 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-oauth-config\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928895 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-serving-cert\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.928919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2t2\" (UniqueName: \"kubernetes.io/projected/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-kube-api-access-6m2t2\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:44 crc 
kubenswrapper[4732]: I1010 06:53:44.928941 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b325ec0f-cef2-4845-a8ba-d59a06dff2ee-metrics-tls\") pod \"dns-operator-744455d44c-tqc8v\" (UID: \"b325ec0f-cef2-4845-a8ba-d59a06dff2ee\") " pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:44 crc kubenswrapper[4732]: E1010 06:53:44.928979 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.428957864 +0000 UTC m=+152.498549095 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f6dc879-6002-4c56-971c-18d3d9d311a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929026 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-259f4\" (UniqueName: \"kubernetes.io/projected/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-kube-api-access-259f4\") pod \"openshift-config-operator-7777fb866f-7dnbb\" 
(UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929042 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f6dc879-6002-4c56-971c-18d3d9d311a6-tmpfs\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929057 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f543214-fd2c-4083-b253-4f9cc914a10c-trusted-ca\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929072 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzqnq\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-kube-api-access-lzqnq\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929088 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g75tf\" (UniqueName: \"kubernetes.io/projected/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-kube-api-access-g75tf\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-service-ca\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929118 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-metrics-certs\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929133 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929167 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c038cc2-3a8f-43e9-afc6-8b22acca9266-audit-dir\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929182 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2qdx\" (UniqueName: \"kubernetes.io/projected/809f770a-ef33-46b3-b29c-8b98fb0440fe-kube-api-access-r2qdx\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929197 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-socket-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m678s\" (UniqueName: \"kubernetes.io/projected/3f543214-fd2c-4083-b253-4f9cc914a10c-kube-api-access-m678s\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929252 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-ca\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929275 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-serving-cert\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929299 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggzcr\" (UniqueName: \"kubernetes.io/projected/209fb0dc-d6b6-476d-b160-6a0052080df5-kube-api-access-ggzcr\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929318 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4c17a98-44ae-4995-9bcc-c398f3b9a476-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929335 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p889q\" (UniqueName: \"kubernetes.io/projected/2e5ae2db-8e0c-4f4e-bef0-9798a6c09683-kube-api-access-p889q\") pod \"ingress-canary-vl4vz\" (UID: \"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683\") " pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929361 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9043fa5-3781-44a6-ab0d-1e5f98755081-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929378 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929394 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2ks\" (UniqueName: \"kubernetes.io/projected/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-kube-api-access-5s2ks\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929410 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/809f770a-ef33-46b3-b29c-8b98fb0440fe-signing-key\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929425 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwhmb\" (UniqueName: \"kubernetes.io/projected/d803f9bf-03e8-4757-9bc2-94692dae48b6-kube-api-access-rwhmb\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929439 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76e4bae7-5083-477d-ac35-4ab579a104ba-secret-volume\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-serving-cert\") pod \"console-f9d7485db-kg7gq\" 
(UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929471 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05b8f5ea-d132-4c18-ac52-00fbac36d987-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929486 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-trusted-ca\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929500 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929514 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97c6\" (UniqueName: \"kubernetes.io/projected/e5164ff6-32ed-4b15-a74a-bd7783dbceea-kube-api-access-d97c6\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929531 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/d803f9bf-03e8-4757-9bc2-94692dae48b6-images\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929546 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w29s\" (UniqueName: \"kubernetes.io/projected/52ff08f1-7920-48d3-ae45-54b18ea49dd2-kube-api-access-7w29s\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929569 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/466cd789-08e1-413b-a590-8c3e3fb3cb40-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v7gcd\" (UID: \"466cd789-08e1-413b-a590-8c3e3fb3cb40\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929586 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-plugins-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929603 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-config\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929619 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d803f9bf-03e8-4757-9bc2-94692dae48b6-proxy-tls\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-certs\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929651 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929667 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-bound-sa-token\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929683 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e64e809-d579-480a-bfed-24473604cff0-images\") pod 
\"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b8f5ea-d132-4c18-ac52-00fbac36d987-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9043fa5-3781-44a6-ab0d-1e5f98755081-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929756 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrx2\" (UniqueName: \"kubernetes.io/projected/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-kube-api-access-kdrx2\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929771 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-config\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 
06:53:44.929790 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-service-ca\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929804 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929822 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dec63d28-86dc-4410-87a5-b7837f0d7070-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wv8n4\" (UID: \"dec63d28-86dc-4410-87a5-b7837f0d7070\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f543214-fd2c-4083-b253-4f9cc914a10c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929856 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-serving-cert\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929872 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjl9z\" (UniqueName: \"kubernetes.io/projected/8f6dc879-6002-4c56-971c-18d3d9d311a6-kube-api-access-fjl9z\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929932 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4c17a98-44ae-4995-9bcc-c398f3b9a476-proxy-tls\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.929971 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.930012 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca006e78-0280-497a-9e0b-1c52edc29e45-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.930032 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-mountpoint-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.930048 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-registration-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.930068 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.930118 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w42zs\" (UniqueName: \"kubernetes.io/projected/5830df43-dc59-4583-8492-fddb895a4266-kube-api-access-w42zs\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931591 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e5164ff6-32ed-4b15-a74a-bd7783dbceea-trusted-ca\") pod 
\"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931624 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5164ff6-32ed-4b15-a74a-bd7783dbceea-config\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931660 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931685 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-metrics-tls\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931724 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c867t\" (UniqueName: \"kubernetes.io/projected/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-kube-api-access-c867t\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931753 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931783 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nznf4\" (UniqueName: \"kubernetes.io/projected/3c038cc2-3a8f-43e9-afc6-8b22acca9266-kube-api-access-nznf4\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931807 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-config-volume\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931844 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e64e809-d579-480a-bfed-24473604cff0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931866 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-console-config\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931893 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca006e78-0280-497a-9e0b-1c52edc29e45-config\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931917 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-csi-data-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/209fb0dc-d6b6-476d-b160-6a0052080df5-machine-approver-tls\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931962 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/209fb0dc-d6b6-476d-b160-6a0052080df5-config\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87szw\" (UniqueName: \"kubernetes.io/projected/3e64e809-d579-480a-bfed-24473604cff0-kube-api-access-87szw\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.931985 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/209fb0dc-d6b6-476d-b160-6a0052080df5-auth-proxy-config\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932005 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f6dc879-6002-4c56-971c-18d3d9d311a6-webhook-cert\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932059 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932085 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj29j\" (UniqueName: \"kubernetes.io/projected/e7a62711-6cb6-4867-a232-8b8b043faa74-kube-api-access-bj29j\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52xb7\" (UniqueName: 
\"kubernetes.io/projected/b325ec0f-cef2-4845-a8ba-d59a06dff2ee-kube-api-access-52xb7\") pod \"dns-operator-744455d44c-tqc8v\" (UID: \"b325ec0f-cef2-4845-a8ba-d59a06dff2ee\") " pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932133 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-config\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b8f5ea-d132-4c18-ac52-00fbac36d987-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932200 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-etcd-client\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932227 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-encryption-config\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932271 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64e809-d579-480a-bfed-24473604cff0-config\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-default-certificate\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932477 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqjw\" (UniqueName: \"kubernetes.io/projected/76e4bae7-5083-477d-ac35-4ab579a104ba-kube-api-access-fkqjw\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pqmx\" (UniqueName: \"kubernetes.io/projected/658be8cb-31b1-4b7f-a96b-e23d029a5365-kube-api-access-4pqmx\") pod \"multus-admission-controller-857f4d67dd-f7phv\" (UID: \"658be8cb-31b1-4b7f-a96b-e23d029a5365\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-oauth-serving-cert\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " 
pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-tls\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932619 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-node-bootstrap-token\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932646 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/52ff08f1-7920-48d3-ae45-54b18ea49dd2-srv-cert\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932669 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932771 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca006e78-0280-497a-9e0b-1c52edc29e45-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932795 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/809f770a-ef33-46b3-b29c-8b98fb0440fe-signing-cabundle\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932818 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f543214-fd2c-4083-b253-4f9cc914a10c-metrics-tls\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932841 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/658be8cb-31b1-4b7f-a96b-e23d029a5365-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f7phv\" (UID: \"658be8cb-31b1-4b7f-a96b-e23d029a5365\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:44 crc 
kubenswrapper[4732]: I1010 06:53:44.932867 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxng\" (UniqueName: \"kubernetes.io/projected/84e57a66-98b6-44af-87b4-d3fcf39fa72b-kube-api-access-wcxng\") pod \"migrator-59844c95c7-jh5dr\" (UID: \"84e57a66-98b6-44af-87b4-d3fcf39fa72b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd55w\" (UniqueName: \"kubernetes.io/projected/f4c17a98-44ae-4995-9bcc-c398f3b9a476-kube-api-access-qd55w\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932944 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/471e7f45-877a-4cba-8e27-b2a249dac74e-service-ca-bundle\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932974 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/76e4bae7-5083-477d-ac35-4ab579a104ba-config-volume\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933000 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-stats-auth\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933031 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-service-ca\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933050 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-trusted-ca-bundle\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933114 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fd9r\" (UniqueName: \"kubernetes.io/projected/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-kube-api-access-7fd9r\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933142 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d803f9bf-03e8-4757-9bc2-94692dae48b6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933169 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qdnx\" (UniqueName: \"kubernetes.io/projected/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-kube-api-access-9qdnx\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933459 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933845 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-oauth-config\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933946 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-trusted-ca-bundle\") pod \"console-f9d7485db-kg7gq\" 
(UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: E1010 06:53:44.934079 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.434067808 +0000 UTC m=+152.503659049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.935073 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c038cc2-3a8f-43e9-afc6-8b22acca9266-audit-dir\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.935593 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-oauth-serving-cert\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.936799 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-trusted-ca\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: 
\"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.937019 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca006e78-0280-497a-9e0b-1c52edc29e45-config\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.937179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-client\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.937613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.933111 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5164ff6-32ed-4b15-a74a-bd7783dbceea-config\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.932945 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-certificates\") pod \"image-registry-697d97f7c8-8tq5m\" 
(UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.938642 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-ca\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.938710 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.938931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-ca-trust-extracted\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.939531 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c038cc2-3a8f-43e9-afc6-8b22acca9266-audit-policies\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.939644 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3e64e809-d579-480a-bfed-24473604cff0-images\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: 
\"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.940407 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-config\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.940450 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05b8f5ea-d132-4c18-ac52-00fbac36d987-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.941005 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-etcd-service-ca\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.941364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca006e78-0280-497a-9e0b-1c52edc29e45-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.941546 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/209fb0dc-d6b6-476d-b160-6a0052080df5-config\") pod 
\"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.942868 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e64e809-d579-480a-bfed-24473604cff0-config\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.943057 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-console-config\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.943913 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp"] Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.944349 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-tls\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.944886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-serving-cert\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.944950 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-etcd-client\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.945187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dec63d28-86dc-4410-87a5-b7837f0d7070-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wv8n4\" (UID: \"dec63d28-86dc-4410-87a5-b7837f0d7070\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.945557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-encryption-config\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.945980 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/209fb0dc-d6b6-476d-b160-6a0052080df5-machine-approver-tls\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.946536 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e64e809-d579-480a-bfed-24473604cff0-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.946940 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-serving-cert\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.948207 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-serving-cert\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.950057 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c038cc2-3a8f-43e9-afc6-8b22acca9266-serving-cert\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.951141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5164ff6-32ed-4b15-a74a-bd7783dbceea-serving-cert\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.951727 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05b8f5ea-d132-4c18-ac52-00fbac36d987-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: 
\"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.955887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-installation-pull-secrets\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.963666 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzqnq\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-kube-api-access-lzqnq\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.981081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g75tf\" (UniqueName: \"kubernetes.io/projected/3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d-kube-api-access-g75tf\") pod \"etcd-operator-b45778765-xghqq\" (UID: \"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:44 crc kubenswrapper[4732]: I1010 06:53:44.981699 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.011132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trp2k\" (UniqueName: \"kubernetes.io/projected/dec63d28-86dc-4410-87a5-b7837f0d7070-kube-api-access-trp2k\") pod \"control-plane-machine-set-operator-78cbb6b69f-wv8n4\" (UID: \"dec63d28-86dc-4410-87a5-b7837f0d7070\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.015453 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.023371 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nznf4\" (UniqueName: \"kubernetes.io/projected/3c038cc2-3a8f-43e9-afc6-8b22acca9266-kube-api-access-nznf4\") pod \"apiserver-7bbb656c7d-zbqvm\" (UID: \"3c038cc2-3a8f-43e9-afc6-8b22acca9266\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.030971 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.034634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.034810 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.534782863 +0000 UTC m=+152.604374114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.034850 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqjw\" (UniqueName: \"kubernetes.io/projected/76e4bae7-5083-477d-ac35-4ab579a104ba-kube-api-access-fkqjw\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.034896 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pqmx\" (UniqueName: 
\"kubernetes.io/projected/658be8cb-31b1-4b7f-a96b-e23d029a5365-kube-api-access-4pqmx\") pod \"multus-admission-controller-857f4d67dd-f7phv\" (UID: \"658be8cb-31b1-4b7f-a96b-e23d029a5365\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.034932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-node-bootstrap-token\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.034957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/52ff08f1-7920-48d3-ae45-54b18ea49dd2-srv-cert\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.034984 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035010 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:45 crc 
kubenswrapper[4732]: I1010 06:53:45.035039 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/809f770a-ef33-46b3-b29c-8b98fb0440fe-signing-cabundle\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f543214-fd2c-4083-b253-4f9cc914a10c-metrics-tls\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035089 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/658be8cb-31b1-4b7f-a96b-e23d029a5365-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f7phv\" (UID: \"658be8cb-31b1-4b7f-a96b-e23d029a5365\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035116 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcxng\" (UniqueName: \"kubernetes.io/projected/84e57a66-98b6-44af-87b4-d3fcf39fa72b-kube-api-access-wcxng\") pod \"migrator-59844c95c7-jh5dr\" (UID: \"84e57a66-98b6-44af-87b4-d3fcf39fa72b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035146 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd55w\" (UniqueName: \"kubernetes.io/projected/f4c17a98-44ae-4995-9bcc-c398f3b9a476-kube-api-access-qd55w\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: 
\"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035170 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/471e7f45-877a-4cba-8e27-b2a249dac74e-service-ca-bundle\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035192 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76e4bae7-5083-477d-ac35-4ab579a104ba-config-volume\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035210 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-stats-auth\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035226 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fd9r\" (UniqueName: \"kubernetes.io/projected/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-kube-api-access-7fd9r\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d803f9bf-03e8-4757-9bc2-94692dae48b6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035264 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qdnx\" (UniqueName: \"kubernetes.io/projected/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-kube-api-access-9qdnx\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-serving-cert\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035310 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vtr\" (UniqueName: \"kubernetes.io/projected/466cd789-08e1-413b-a590-8c3e3fb3cb40-kube-api-access-t2vtr\") pod \"package-server-manager-789f6589d5-v7gcd\" (UID: \"466cd789-08e1-413b-a590-8c3e3fb3cb40\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035326 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5ae2db-8e0c-4f4e-bef0-9798a6c09683-cert\") pod \"ingress-canary-vl4vz\" (UID: \"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683\") " pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 
06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035343 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45pph\" (UniqueName: \"kubernetes.io/projected/471e7f45-877a-4cba-8e27-b2a249dac74e-kube-api-access-45pph\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035360 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll797\" (UniqueName: \"kubernetes.io/projected/e9043fa5-3781-44a6-ab0d-1e5f98755081-kube-api-access-ll797\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035381 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-srv-cert\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035411 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/52ff08f1-7920-48d3-ae45-54b18ea49dd2-profile-collector-cert\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035441 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2t2\" (UniqueName: 
\"kubernetes.io/projected/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-kube-api-access-6m2t2\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035463 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f6dc879-6002-4c56-971c-18d3d9d311a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035488 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b325ec0f-cef2-4845-a8ba-d59a06dff2ee-metrics-tls\") pod \"dns-operator-744455d44c-tqc8v\" (UID: \"b325ec0f-cef2-4845-a8ba-d59a06dff2ee\") " pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035516 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f6dc879-6002-4c56-971c-18d3d9d311a6-tmpfs\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035535 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f543214-fd2c-4083-b253-4f9cc914a10c-trusted-ca\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035560 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-metrics-certs\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2qdx\" (UniqueName: \"kubernetes.io/projected/809f770a-ef33-46b3-b29c-8b98fb0440fe-kube-api-access-r2qdx\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-socket-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035655 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m678s\" (UniqueName: \"kubernetes.io/projected/3f543214-fd2c-4083-b253-4f9cc914a10c-kube-api-access-m678s\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 
06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035712 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4c17a98-44ae-4995-9bcc-c398f3b9a476-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035732 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p889q\" (UniqueName: \"kubernetes.io/projected/2e5ae2db-8e0c-4f4e-bef0-9798a6c09683-kube-api-access-p889q\") pod \"ingress-canary-vl4vz\" (UID: \"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683\") " pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035751 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9043fa5-3781-44a6-ab0d-1e5f98755081-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035767 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035787 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2ks\" (UniqueName: 
\"kubernetes.io/projected/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-kube-api-access-5s2ks\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035803 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/809f770a-ef33-46b3-b29c-8b98fb0440fe-signing-key\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035818 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwhmb\" (UniqueName: \"kubernetes.io/projected/d803f9bf-03e8-4757-9bc2-94692dae48b6-kube-api-access-rwhmb\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035835 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76e4bae7-5083-477d-ac35-4ab579a104ba-secret-volume\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035863 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d803f9bf-03e8-4757-9bc2-94692dae48b6-images\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035891 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w29s\" (UniqueName: \"kubernetes.io/projected/52ff08f1-7920-48d3-ae45-54b18ea49dd2-kube-api-access-7w29s\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035909 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/466cd789-08e1-413b-a590-8c3e3fb3cb40-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v7gcd\" (UID: \"466cd789-08e1-413b-a590-8c3e3fb3cb40\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035942 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-plugins-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035966 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-config\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035968 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: 
\"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.035983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d803f9bf-03e8-4757-9bc2-94692dae48b6-proxy-tls\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036017 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-certs\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036044 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9043fa5-3781-44a6-ab0d-1e5f98755081-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036062 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrx2\" (UniqueName: \"kubernetes.io/projected/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-kube-api-access-kdrx2\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036081 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-config\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036101 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036119 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f543214-fd2c-4083-b253-4f9cc914a10c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036140 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjl9z\" (UniqueName: \"kubernetes.io/projected/8f6dc879-6002-4c56-971c-18d3d9d311a6-kube-api-access-fjl9z\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4c17a98-44ae-4995-9bcc-c398f3b9a476-proxy-tls\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036175 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036197 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-mountpoint-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036211 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-registration-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036228 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w42zs\" (UniqueName: \"kubernetes.io/projected/5830df43-dc59-4583-8492-fddb895a4266-kube-api-access-w42zs\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036245 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-profile-collector-cert\") 
pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-metrics-tls\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036276 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c867t\" (UniqueName: \"kubernetes.io/projected/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-kube-api-access-c867t\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036291 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-config-volume\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036312 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-csi-data-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036331 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f6dc879-6002-4c56-971c-18d3d9d311a6-webhook-cert\") pod \"packageserver-d55dfcdfc-vrj7r\" 
(UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036345 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036364 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52xb7\" (UniqueName: \"kubernetes.io/projected/b325ec0f-cef2-4845-a8ba-d59a06dff2ee-kube-api-access-52xb7\") pod \"dns-operator-744455d44c-tqc8v\" (UID: \"b325ec0f-cef2-4845-a8ba-d59a06dff2ee\") " pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.036388 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-default-certificate\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.037086 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d803f9bf-03e8-4757-9bc2-94692dae48b6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.037955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/471e7f45-877a-4cba-8e27-b2a249dac74e-service-ca-bundle\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.037996 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/809f770a-ef33-46b3-b29c-8b98fb0440fe-signing-cabundle\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.038548 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76e4bae7-5083-477d-ac35-4ab579a104ba-config-volume\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.040225 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9043fa5-3781-44a6-ab0d-1e5f98755081-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.041044 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f6dc879-6002-4c56-971c-18d3d9d311a6-tmpfs\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.041510 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-mountpoint-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.041547 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.042196 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-socket-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.042330 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-stats-auth\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.043097 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4c17a98-44ae-4995-9bcc-c398f3b9a476-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.043531 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f543214-fd2c-4083-b253-4f9cc914a10c-trusted-ca\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.043768 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-registration-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.044603 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-config-volume\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.044740 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-csi-data-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.045099 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-certs\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.045508 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-default-certificate\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc 
kubenswrapper[4732]: E1010 06:53:45.046412 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.546393919 +0000 UTC m=+152.615985260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.046573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d803f9bf-03e8-4757-9bc2-94692dae48b6-proxy-tls\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.046588 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f543214-fd2c-4083-b253-4f9cc914a10c-metrics-tls\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.047343 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d803f9bf-03e8-4757-9bc2-94692dae48b6-images\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.047390 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-config\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.047769 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.047848 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9043fa5-3781-44a6-ab0d-1e5f98755081-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.049014 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-259f4\" (UniqueName: \"kubernetes.io/projected/9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8-kube-api-access-259f4\") pod \"openshift-config-operator-7777fb866f-7dnbb\" (UID: \"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.049101 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/52ff08f1-7920-48d3-ae45-54b18ea49dd2-srv-cert\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.049902 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/471e7f45-877a-4cba-8e27-b2a249dac74e-metrics-certs\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.050337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/5830df43-dc59-4583-8492-fddb895a4266-plugins-dir\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.051063 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/658be8cb-31b1-4b7f-a96b-e23d029a5365-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f7phv\" (UID: \"658be8cb-31b1-4b7f-a96b-e23d029a5365\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.051392 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e5ae2db-8e0c-4f4e-bef0-9798a6c09683-cert\") pod \"ingress-canary-vl4vz\" (UID: \"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683\") " pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.052122 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.052388 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.052794 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f6dc879-6002-4c56-971c-18d3d9d311a6-webhook-cert\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.052972 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/52ff08f1-7920-48d3-ae45-54b18ea49dd2-profile-collector-cert\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.053676 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-config\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.055661 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76e4bae7-5083-477d-ac35-4ab579a104ba-secret-volume\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.055769 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-metrics-tls\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.055995 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/466cd789-08e1-413b-a590-8c3e3fb3cb40-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-v7gcd\" (UID: \"466cd789-08e1-413b-a590-8c3e3fb3cb40\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.056324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.056381 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-node-bootstrap-token\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " 
pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.056625 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.057985 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b325ec0f-cef2-4845-a8ba-d59a06dff2ee-metrics-tls\") pod \"dns-operator-744455d44c-tqc8v\" (UID: \"b325ec0f-cef2-4845-a8ba-d59a06dff2ee\") " pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.058680 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-serving-cert\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:45 crc kubenswrapper[4732]: W1010 06:53:45.058802 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5e93139_328d_4d3c_bc5d_7c25d67f51d2.slice/crio-2d37120c1a4684ec04aeb1478c98c2f445977d334386e0c85273acde152b195c WatchSource:0}: Error finding container 2d37120c1a4684ec04aeb1478c98c2f445977d334386e0c85273acde152b195c: Status 404 returned error can't find the container with id 2d37120c1a4684ec04aeb1478c98c2f445977d334386e0c85273acde152b195c Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.058866 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4c17a98-44ae-4995-9bcc-c398f3b9a476-proxy-tls\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.059224 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f6dc879-6002-4c56-971c-18d3d9d311a6-apiservice-cert\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.064598 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/809f770a-ef33-46b3-b29c-8b98fb0440fe-signing-key\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.065491 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-srv-cert\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.070249 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05b8f5ea-d132-4c18-ac52-00fbac36d987-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-4jwhf\" (UID: \"05b8f5ea-d132-4c18-ac52-00fbac36d987\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.080783 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5bw\" (UniqueName: \"kubernetes.io/projected/45674fdf-b85c-4d66-afc3-b0fad73523da-kube-api-access-xx5bw\") pod \"downloads-7954f5f757-f7zpr\" (UID: \"45674fdf-b85c-4d66-afc3-b0fad73523da\") " pod="openshift-console/downloads-7954f5f757-f7zpr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.088480 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f7zpr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.107351 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97c6\" (UniqueName: \"kubernetes.io/projected/e5164ff6-32ed-4b15-a74a-bd7783dbceea-kube-api-access-d97c6\") pod \"console-operator-58897d9998-bbf6g\" (UID: \"e5164ff6-32ed-4b15-a74a-bd7783dbceea\") " pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.121719 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-bound-sa-token\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.134243 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.137790 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.137929 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.637907741 +0000 UTC m=+152.707498982 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.138177 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.138495 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.638487477 +0000 UTC m=+152.708078718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.140845 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.148419 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.148589 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87szw\" (UniqueName: \"kubernetes.io/projected/3e64e809-d579-480a-bfed-24473604cff0-kube-api-access-87szw\") pod \"machine-api-operator-5694c8668f-z2jfv\" (UID: \"3e64e809-d579-480a-bfed-24473604cff0\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.161727 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.163054 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.166800 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj29j\" (UniqueName: \"kubernetes.io/projected/e7a62711-6cb6-4867-a232-8b8b043faa74-kube-api-access-bj29j\") pod \"console-f9d7485db-kg7gq\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.190990 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca006e78-0280-497a-9e0b-1c52edc29e45-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-96qlm\" (UID: \"ca006e78-0280-497a-9e0b-1c52edc29e45\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.225911 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.234728 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggzcr\" (UniqueName: \"kubernetes.io/projected/209fb0dc-d6b6-476d-b160-6a0052080df5-kube-api-access-ggzcr\") pod \"machine-approver-56656f9798-tmbsf\" (UID: \"209fb0dc-d6b6-476d-b160-6a0052080df5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.254283 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.254894 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.255842 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.755822989 +0000 UTC m=+152.825414220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.280563 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pqmx\" (UniqueName: \"kubernetes.io/projected/658be8cb-31b1-4b7f-a96b-e23d029a5365-kube-api-access-4pqmx\") pod \"multus-admission-controller-857f4d67dd-f7phv\" (UID: \"658be8cb-31b1-4b7f-a96b-e23d029a5365\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.298343 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fdshh"] Oct 10 06:53:45 crc 
kubenswrapper[4732]: I1010 06:53:45.314330 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqjw\" (UniqueName: \"kubernetes.io/projected/76e4bae7-5083-477d-ac35-4ab579a104ba-kube-api-access-fkqjw\") pod \"collect-profiles-29334645-j4gb5\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.318047 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.332523 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fd9r\" (UniqueName: \"kubernetes.io/projected/8359b8ee-4138-4f5e-93e6-438c6c1aba4d-kube-api-access-7fd9r\") pod \"machine-config-server-qtwpt\" (UID: \"8359b8ee-4138-4f5e-93e6-438c6c1aba4d\") " pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.333407 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd55w\" (UniqueName: \"kubernetes.io/projected/f4c17a98-44ae-4995-9bcc-c398f3b9a476-kube-api-access-qd55w\") pod \"machine-config-controller-84d6567774-lhkx9\" (UID: \"f4c17a98-44ae-4995-9bcc-c398f3b9a476\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.356728 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 
06:53:45.369279 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.373970 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjl9z\" (UniqueName: \"kubernetes.io/projected/8f6dc879-6002-4c56-971c-18d3d9d311a6-kube-api-access-fjl9z\") pod \"packageserver-d55dfcdfc-vrj7r\" (UID: \"8f6dc879-6002-4c56-971c-18d3d9d311a6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.374291 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f543214-fd2c-4083-b253-4f9cc914a10c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: \"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.384808 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.857223922 +0000 UTC m=+152.926815163 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.387452 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.391586 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lnn79\" (UID: \"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.398665 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2qdx\" (UniqueName: \"kubernetes.io/projected/809f770a-ef33-46b3-b29c-8b98fb0440fe-kube-api-access-r2qdx\") pod \"service-ca-9c57cc56f-78wpk\" (UID: \"809f770a-ef33-46b3-b29c-8b98fb0440fe\") " pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.402036 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" event={"ID":"209fb0dc-d6b6-476d-b160-6a0052080df5","Type":"ContainerStarted","Data":"27de21b10ec816bcad0a985559f2196ab1821edfc502c22732f235aee6e3caaf"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.403231 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" event={"ID":"46c511d3-25e3-422a-b0b2-099b14de9a01","Type":"ContainerStarted","Data":"1169c09e985a2bc59bde67305c254b03f70ad98bc313f7b991b4e577575df446"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.416955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m678s\" (UniqueName: \"kubernetes.io/projected/3f543214-fd2c-4083-b253-4f9cc914a10c-kube-api-access-m678s\") pod \"ingress-operator-5b745b69d9-dq6ts\" (UID: 
\"3f543214-fd2c-4083-b253-4f9cc914a10c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.432802 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p889q\" (UniqueName: \"kubernetes.io/projected/2e5ae2db-8e0c-4f4e-bef0-9798a6c09683-kube-api-access-p889q\") pod \"ingress-canary-vl4vz\" (UID: \"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683\") " pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.447721 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.454183 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c867t\" (UniqueName: \"kubernetes.io/projected/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-kube-api-access-c867t\") pod \"marketplace-operator-79b997595-g2xxs\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.457568 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.457886 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.957869975 +0000 UTC m=+153.027461216 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.460499 4732 generic.go:334] "Generic (PLEG): container finished" podID="c1f6a9a4-0043-442a-9f1a-6661546d2397" containerID="dfa63d0cf94162b4ffa87c6fab21fa6e93228294d28a807a7f9081a366736895" exitCode=0 Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.460664 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" event={"ID":"c1f6a9a4-0043-442a-9f1a-6661546d2397","Type":"ContainerDied","Data":"dfa63d0cf94162b4ffa87c6fab21fa6e93228294d28a807a7f9081a366736895"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.460959 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.461564 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:45.961553722 +0000 UTC m=+153.031144963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.474263 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w42zs\" (UniqueName: \"kubernetes.io/projected/5830df43-dc59-4583-8492-fddb895a4266-kube-api-access-w42zs\") pod \"csi-hostpathplugin-gsx4x\" (UID: \"5830df43-dc59-4583-8492-fddb895a4266\") " pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.488823 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" event={"ID":"524083e6-c56c-4c74-b700-ac668cb2022c","Type":"ContainerStarted","Data":"a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.488875 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" event={"ID":"524083e6-c56c-4c74-b700-ac668cb2022c","Type":"ContainerStarted","Data":"bf80d81a548701aee45554b4063f73093161747322aa85ae4c118c42457d69e0"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.489395 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.492231 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qdnx\" (UniqueName: \"kubernetes.io/projected/6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b-kube-api-access-9qdnx\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-nw2jr\" (UID: \"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.497254 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" event={"ID":"d5e93139-328d-4d3c-bc5d-7c25d67f51d2","Type":"ContainerStarted","Data":"3776e1270db63b6e4e1b470cee790ed3521e8cf049287cfcf2bf4466143b2c6e"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.497291 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" event={"ID":"d5e93139-328d-4d3c-bc5d-7c25d67f51d2","Type":"ContainerStarted","Data":"2d37120c1a4684ec04aeb1478c98c2f445977d334386e0c85273acde152b195c"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.504889 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcxng\" (UniqueName: \"kubernetes.io/projected/84e57a66-98b6-44af-87b4-d3fcf39fa72b-kube-api-access-wcxng\") pod \"migrator-59844c95c7-jh5dr\" (UID: \"84e57a66-98b6-44af-87b4-d3fcf39fa72b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.506042 4732 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8qd8h container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body= Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.506071 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" podUID="524083e6-c56c-4c74-b700-ac668cb2022c" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.507885 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-bbf6g"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.533510 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" event={"ID":"99b7a779-8943-4774-b15e-959fa326d08d","Type":"ContainerStarted","Data":"c77b5cc49880e94b9e1b99c1724f0f327b1ef5e113b1b44df281391f92522fc4"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.533549 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" event={"ID":"99b7a779-8943-4774-b15e-959fa326d08d","Type":"ContainerStarted","Data":"3dc2db87c35ba6c93821d58e743526dd98cf45e70091b8dc960b77a89b2afdf9"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.533756 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.535665 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.540560 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll797\" (UniqueName: \"kubernetes.io/projected/e9043fa5-3781-44a6-ab0d-1e5f98755081-kube-api-access-ll797\") pod \"openshift-controller-manager-operator-756b6f6bc6-2q7sx\" (UID: \"e9043fa5-3781-44a6-ab0d-1e5f98755081\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.544983 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" event={"ID":"e143458e-3d6b-4b61-8811-c462db11f97f","Type":"ContainerStarted","Data":"f5b84774b44b876cf98c20e34fcae860863cf16206b5b3269aa22620bf7c5ed0"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.545021 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" event={"ID":"e143458e-3d6b-4b61-8811-c462db11f97f","Type":"ContainerStarted","Data":"99e717671722023342e22bad2c95867ce68d942a5f15a5dd5d8823d5bffa12ff"} Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.554785 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.561911 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.562544 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.062529064 +0000 UTC m=+153.132120305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.563538 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.564078 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2vtr\" (UniqueName: \"kubernetes.io/projected/466cd789-08e1-413b-a590-8c3e3fb3cb40-kube-api-access-t2vtr\") pod \"package-server-manager-789f6589d5-v7gcd\" (UID: \"466cd789-08e1-413b-a590-8c3e3fb3cb40\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.566846 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrx2\" (UniqueName: \"kubernetes.io/projected/4d3e2cf8-080a-42b5-8c26-1336d0279fd4-kube-api-access-kdrx2\") pod \"dns-default-ssm2l\" (UID: \"4d3e2cf8-080a-42b5-8c26-1336d0279fd4\") " pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.571554 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.578237 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.587637 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.599386 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.607410 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.609304 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2ks\" (UniqueName: \"kubernetes.io/projected/7d7fb490-d6a4-4bc8-980c-d0199c8c223e-kube-api-access-5s2ks\") pod \"olm-operator-6b444d44fb-knm8v\" (UID: \"7d7fb490-d6a4-4bc8-980c-d0199c8c223e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.615381 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pph\" (UniqueName: \"kubernetes.io/projected/471e7f45-877a-4cba-8e27-b2a249dac74e-kube-api-access-45pph\") pod \"router-default-5444994796-wkrlj\" (UID: \"471e7f45-877a-4cba-8e27-b2a249dac74e\") " pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.634911 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qtwpt" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.636846 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2t2\" (UniqueName: \"kubernetes.io/projected/3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4-kube-api-access-6m2t2\") pod \"service-ca-operator-777779d784-tl8xz\" (UID: \"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.638987 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.649340 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.653164 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52xb7\" (UniqueName: \"kubernetes.io/projected/b325ec0f-cef2-4845-a8ba-d59a06dff2ee-kube-api-access-52xb7\") pod \"dns-operator-744455d44c-tqc8v\" (UID: \"b325ec0f-cef2-4845-a8ba-d59a06dff2ee\") " pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.657896 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.663487 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f7zpr"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.664558 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.668409 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.670133 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.170115089 +0000 UTC m=+153.239706330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.673314 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwhmb\" (UniqueName: \"kubernetes.io/projected/d803f9bf-03e8-4757-9bc2-94692dae48b6-kube-api-access-rwhmb\") pod \"machine-config-operator-74547568cd-77wqx\" (UID: \"d803f9bf-03e8-4757-9bc2-94692dae48b6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.673333 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.697443 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.700791 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vl4vz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.700813 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w29s\" (UniqueName: \"kubernetes.io/projected/52ff08f1-7920-48d3-ae45-54b18ea49dd2-kube-api-access-7w29s\") pod \"catalog-operator-68c6474976-8bxnv\" (UID: \"52ff08f1-7920-48d3-ae45-54b18ea49dd2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.701170 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xghqq"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.709225 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.772938 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.773602 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.273583427 +0000 UTC m=+153.343174668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.776032 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.779248 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.791053 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" Oct 10 06:53:45 crc kubenswrapper[4732]: W1010 06:53:45.798108 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45674fdf_b85c_4d66_afc3_b0fad73523da.slice/crio-b746360d90769c605f8739ee3caea71471a3d636dceb135a708c03c90a7a6e6b WatchSource:0}: Error finding container b746360d90769c605f8739ee3caea71471a3d636dceb135a708c03c90a7a6e6b: Status 404 returned error can't find the container with id b746360d90769c605f8739ee3caea71471a3d636dceb135a708c03c90a7a6e6b Oct 10 06:53:45 crc kubenswrapper[4732]: W1010 06:53:45.800261 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec63d28_86dc_4410_87a5_b7837f0d7070.slice/crio-9090b649910f8a7be6a0df54607fd085ee362b4f51c9680255051b02013f99f7 WatchSource:0}: Error finding container 
9090b649910f8a7be6a0df54607fd085ee362b4f51c9680255051b02013f99f7: Status 404 returned error can't find the container with id 9090b649910f8a7be6a0df54607fd085ee362b4f51c9680255051b02013f99f7 Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.847958 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.850054 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-z2jfv"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.852924 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:45 crc kubenswrapper[4732]: W1010 06:53:45.867216 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b37b694_d6d2_4f1f_8515_ae9ecdbbda7d.slice/crio-bb2035f6b2f7dd717bd6eb1f1ecbaf84fb1302f76fc9626e23850488295e9b4f WatchSource:0}: Error finding container bb2035f6b2f7dd717bd6eb1f1ecbaf84fb1302f76fc9626e23850488295e9b4f: Status 404 returned error can't find the container with id bb2035f6b2f7dd717bd6eb1f1ecbaf84fb1302f76fc9626e23850488295e9b4f Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.874234 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.874647 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.37463568 +0000 UTC m=+153.444226921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.891332 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.937294 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-kg7gq"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.977841 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:45 crc kubenswrapper[4732]: E1010 06:53:45.978544 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.478530389 +0000 UTC m=+153.548121630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.992321 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm"] Oct 10 06:53:45 crc kubenswrapper[4732]: I1010 06:53:45.992370 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf"] Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.010096 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb"] Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.079229 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.079799 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.579768657 +0000 UTC m=+153.649359898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.118176 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5"] Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.188444 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts"] Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.194835 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.195350 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.695326343 +0000 UTC m=+153.764917594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.255682 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" podStartSLOduration=128.255647683 podStartE2EDuration="2m8.255647683s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:46.212186618 +0000 UTC m=+153.281777879" watchObservedRunningTime="2025-10-10 06:53:46.255647683 +0000 UTC m=+153.325238924" Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.293598 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vvpcw" podStartSLOduration=128.293578433 podStartE2EDuration="2m8.293578433s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:46.263034368 +0000 UTC m=+153.332625609" watchObservedRunningTime="2025-10-10 06:53:46.293578433 +0000 UTC m=+153.363169674" Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.297037 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-10 06:53:46.797017784 +0000 UTC m=+153.866609025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.296568 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.398601 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.399284 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:46.899268899 +0000 UTC m=+153.968860140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.502533 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.502974 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.002958742 +0000 UTC m=+154.072549983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.584422 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" event={"ID":"3e64e809-d579-480a-bfed-24473604cff0","Type":"ContainerStarted","Data":"3399fa8abdd8dc1df6a68da49a19845f3c4d413711b14eadedce4bf96b1b5531"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.603433 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.604098 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.104079487 +0000 UTC m=+154.173670728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.605934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" event={"ID":"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d","Type":"ContainerStarted","Data":"bb2035f6b2f7dd717bd6eb1f1ecbaf84fb1302f76fc9626e23850488295e9b4f"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.610327 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" podStartSLOduration=128.610308881 podStartE2EDuration="2m8.610308881s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:46.578825852 +0000 UTC m=+153.648417103" watchObservedRunningTime="2025-10-10 06:53:46.610308881 +0000 UTC m=+153.679900122" Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.617948 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" event={"ID":"c1f6a9a4-0043-442a-9f1a-6661546d2397","Type":"ContainerStarted","Data":"c21ea32c5c1bb9e9513f1fb2de9bdebfb7d0710a7a70dea2f577fcfb6ee854b2"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.632785 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" 
event={"ID":"99b7a779-8943-4774-b15e-959fa326d08d","Type":"ContainerStarted","Data":"31cae7969021dabd9aee1612ea28d0cbf92aab920c34f735d2f17c67aa71a396"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.642912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f7zpr" event={"ID":"45674fdf-b85c-4d66-afc3-b0fad73523da","Type":"ContainerStarted","Data":"b746360d90769c605f8739ee3caea71471a3d636dceb135a708c03c90a7a6e6b"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.644800 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v"] Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.682454 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79"] Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.704906 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.705291 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.205277735 +0000 UTC m=+154.274868986 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.712382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" event={"ID":"209fb0dc-d6b6-476d-b160-6a0052080df5","Type":"ContainerStarted","Data":"631694b6aa962c1531f1ddfba8b5aa67f68d5ec4cb8d73650c8d0d7d076c2711"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.716163 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kg7gq" event={"ID":"e7a62711-6cb6-4867-a232-8b8b043faa74","Type":"ContainerStarted","Data":"bf5eda4ec642e56e7abf1be6d56f32c7819c21d02b880999b2cca11b594cdedc"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.731388 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" event={"ID":"ca006e78-0280-497a-9e0b-1c52edc29e45","Type":"ContainerStarted","Data":"6f1198ddac042bb54522ce846efd87fbd885fb50198ee44fb47dacf80b8fdacc"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.763528 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" event={"ID":"3c038cc2-3a8f-43e9-afc6-8b22acca9266","Type":"ContainerStarted","Data":"7994ef85b439824936731ce09929c76f2c3b688298dbc0f32c422ae4597ba65f"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.774334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" 
event={"ID":"76e4bae7-5083-477d-ac35-4ab579a104ba","Type":"ContainerStarted","Data":"1ac9e5c1806c0d2d951fe6104e6e0f45c0c4414e81e6ad772f716e5bad16a491"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.818446 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.819715 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.31967647 +0000 UTC m=+154.389267711 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: W1010 06:53:46.830162 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7fb490_d6a4_4bc8_980c_d0199c8c223e.slice/crio-5d76455a55e5aefd1116cbf071c5e69066007cdc879f29fd617d2ef5edce5fe3 WatchSource:0}: Error finding container 5d76455a55e5aefd1116cbf071c5e69066007cdc879f29fd617d2ef5edce5fe3: Status 404 returned error can't find the container with id 5d76455a55e5aefd1116cbf071c5e69066007cdc879f29fd617d2ef5edce5fe3 Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.830764 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bbf6g" event={"ID":"e5164ff6-32ed-4b15-a74a-bd7783dbceea","Type":"ContainerStarted","Data":"59cedcc60bcf7537e4dc039dee8598dcf1b586e14e435ad3e079b67382a583b3"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.852482 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" event={"ID":"dec63d28-86dc-4410-87a5-b7837f0d7070","Type":"ContainerStarted","Data":"9090b649910f8a7be6a0df54607fd085ee362b4f51c9680255051b02013f99f7"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.857190 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" event={"ID":"05b8f5ea-d132-4c18-ac52-00fbac36d987","Type":"ContainerStarted","Data":"efb8fcca4a54ef61d8b25ae646e85f11dddec05b539d8de076b70275ad899082"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.884790 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" event={"ID":"3f543214-fd2c-4083-b253-4f9cc914a10c","Type":"ContainerStarted","Data":"5efdb1bd7d68338e3d2d3d7f95e3e03151bf8743e9e09c9422e60a097c8a3e01"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.953472 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:46 crc kubenswrapper[4732]: E1010 06:53:46.953874 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.453852887 +0000 UTC m=+154.523444128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:46 crc kubenswrapper[4732]: W1010 06:53:46.955584 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode769faf1_a610_4e5f_9dbd_cac1aaa1c1d0.slice/crio-dc7f911b01e55a65986303a2e1dc60dc736b77da94b1e0c9b09cf32994eea861 WatchSource:0}: Error finding container dc7f911b01e55a65986303a2e1dc60dc736b77da94b1e0c9b09cf32994eea861: Status 404 returned error can't find the container with id dc7f911b01e55a65986303a2e1dc60dc736b77da94b1e0c9b09cf32994eea861 Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.956003 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qtwpt" event={"ID":"8359b8ee-4138-4f5e-93e6-438c6c1aba4d","Type":"ContainerStarted","Data":"02f27e86546a2473124bbc0de5d952e6d47a283c7114497ead981f9411323af7"} Oct 10 06:53:46 crc kubenswrapper[4732]: I1010 06:53:46.979467 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ssm2l"] Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.016786 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vl4vz"] Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.033045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" 
event={"ID":"76679b84-27e7-4a6a-b904-f399c9b7eb8d","Type":"ContainerStarted","Data":"583b9d31e50ae9fbcc8a1d3b53bf53664a32c7f06a2c7058c4114e5f3ac48563"} Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.033921 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.052856 4732 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-l858p container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.052908 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" podUID="76679b84-27e7-4a6a-b904-f399c9b7eb8d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.053156 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" event={"ID":"46c511d3-25e3-422a-b0b2-099b14de9a01","Type":"ContainerStarted","Data":"a31b373bfd51d1727e27f7fa4ebc6d814c70d78ddc70a7551a8500408fd18a11"} Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.054562 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.055437 4732 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.555422824 +0000 UTC m=+154.625014065 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.057531 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wkrlj" event={"ID":"471e7f45-877a-4cba-8e27-b2a249dac74e","Type":"ContainerStarted","Data":"b60af5d1656c32dcd6ed87944dee8e72f3774b39abce08c1f7eba468f88e10d6"} Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.063802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" event={"ID":"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8","Type":"ContainerStarted","Data":"de7fc4c17976e1437134dbd0075db199af05ee34a9c514e97480675084e68b5b"} Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.159459 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.167009 4732 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.666994185 +0000 UTC m=+154.736585426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.194118 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9"] Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.200084 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r"] Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.264190 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.266346 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.766326023 +0000 UTC m=+154.835917264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.365410 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.365732 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.865720133 +0000 UTC m=+154.935311374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.469774 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.470027 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:47.970008472 +0000 UTC m=+155.039599713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.471843 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4cfw4" podStartSLOduration=130.47182932 podStartE2EDuration="2m10.47182932s" podCreationTimestamp="2025-10-10 06:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:47.469750055 +0000 UTC m=+154.539341296" watchObservedRunningTime="2025-10-10 06:53:47.47182932 +0000 UTC m=+154.541420551" Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.573380 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.573840 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.073822448 +0000 UTC m=+155.143413689 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.580256 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fdshh" podStartSLOduration=129.580239827 podStartE2EDuration="2m9.580239827s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:47.579153099 +0000 UTC m=+154.648744360" watchObservedRunningTime="2025-10-10 06:53:47.580239827 +0000 UTC m=+154.649831078" Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.583063 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v6gpp" podStartSLOduration=129.583047981 podStartE2EDuration="2m9.583047981s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:47.547553596 +0000 UTC m=+154.617144837" watchObservedRunningTime="2025-10-10 06:53:47.583047981 +0000 UTC m=+154.652639232" Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.601320 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" podStartSLOduration=129.601296782 podStartE2EDuration="2m9.601296782s" podCreationTimestamp="2025-10-10 
06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:47.600593404 +0000 UTC m=+154.670184665" watchObservedRunningTime="2025-10-10 06:53:47.601296782 +0000 UTC m=+154.670888023" Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.676294 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.676591 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.176572756 +0000 UTC m=+155.246163997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: W1010 06:53:47.719175 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6dc879_6002_4c56_971c_18d3d9d311a6.slice/crio-a71cf750e0c7a26dcf511c6d4fb66da24b5ca76361887f64f7a358234971aed0 WatchSource:0}: Error finding container a71cf750e0c7a26dcf511c6d4fb66da24b5ca76361887f64f7a358234971aed0: Status 404 returned error can't find the container with id a71cf750e0c7a26dcf511c6d4fb66da24b5ca76361887f64f7a358234971aed0 Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.778328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.778679 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.278666207 +0000 UTC m=+155.348257448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.873012 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.886054 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.886178 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.386160231 +0000 UTC m=+155.455751472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.886352 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.886601 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.386584622 +0000 UTC m=+155.456175863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:47 crc kubenswrapper[4732]: I1010 06:53:47.992143 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:47 crc kubenswrapper[4732]: E1010 06:53:47.992521 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.492501704 +0000 UTC m=+155.562092945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.081048 4732 generic.go:334] "Generic (PLEG): container finished" podID="3c038cc2-3a8f-43e9-afc6-8b22acca9266" containerID="453ff3d33eee07b8d936cc98f8d9ab533285f54a19354f4bdc5086e8349368c6" exitCode=0 Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.081367 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" event={"ID":"3c038cc2-3a8f-43e9-afc6-8b22acca9266","Type":"ContainerDied","Data":"453ff3d33eee07b8d936cc98f8d9ab533285f54a19354f4bdc5086e8349368c6"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.096632 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.097307 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.597286955 +0000 UTC m=+155.666878196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.100073 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f7zpr" event={"ID":"45674fdf-b85c-4d66-afc3-b0fad73523da","Type":"ContainerStarted","Data":"208fd0486a36252d8e7c408d7ef4ecbacd2c61a5fd46dc4f0c3aa82019b8c4a8"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.102176 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-f7zpr" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.139661 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" event={"ID":"7d7fb490-d6a4-4bc8-980c-d0199c8c223e","Type":"ContainerStarted","Data":"5d76455a55e5aefd1116cbf071c5e69066007cdc879f29fd617d2ef5edce5fe3"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.143980 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7zpr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.144038 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f7zpr" podUID="45674fdf-b85c-4d66-afc3-b0fad73523da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection 
refused" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.155596 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" event={"ID":"3f543214-fd2c-4083-b253-4f9cc914a10c","Type":"ContainerStarted","Data":"f660d028c0ba0a56d90a080bb2b58d5f5278c3edca6848e03b7b0da46d12c306"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.157491 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-f7zpr" podStartSLOduration=130.157474221 podStartE2EDuration="2m10.157474221s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:48.152840489 +0000 UTC m=+155.222431760" watchObservedRunningTime="2025-10-10 06:53:48.157474221 +0000 UTC m=+155.227065462" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.168890 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-bbf6g" event={"ID":"e5164ff6-32ed-4b15-a74a-bd7783dbceea","Type":"ContainerStarted","Data":"6c307ce89e666964cfa669e2e52a394495468def0a2832b16ec0be1795c92f6a"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.169197 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.173461 4732 patch_prober.go:28] interesting pod/console-operator-58897d9998-bbf6g container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.173515 4732 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-bbf6g" podUID="e5164ff6-32ed-4b15-a74a-bd7783dbceea" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.179312 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vl4vz" event={"ID":"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683","Type":"ContainerStarted","Data":"f6a3474f03282b7f09af5f6509ec6d273f6efca0305fcc21fa4c8bb6c880e4d8"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.187184 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qtwpt" event={"ID":"8359b8ee-4138-4f5e-93e6-438c6c1aba4d","Type":"ContainerStarted","Data":"4aa08e3eee577119ed2832c506ab4c4431f9fd06676cb29083d32822d0a5b19b"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.195631 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" event={"ID":"8f6dc879-6002-4c56-971c-18d3d9d311a6","Type":"ContainerStarted","Data":"a71cf750e0c7a26dcf511c6d4fb66da24b5ca76361887f64f7a358234971aed0"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.201623 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.202478 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qtwpt" podStartSLOduration=6.202465607 podStartE2EDuration="6.202465607s" podCreationTimestamp="2025-10-10 06:53:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:48.201147442 +0000 UTC m=+155.270738683" watchObservedRunningTime="2025-10-10 06:53:48.202465607 +0000 UTC m=+155.272056848" Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.202654 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.702629971 +0000 UTC m=+155.772221212 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.203291 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-bbf6g" podStartSLOduration=130.203286049 podStartE2EDuration="2m10.203286049s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:48.186455945 +0000 UTC m=+155.256047196" watchObservedRunningTime="2025-10-10 06:53:48.203286049 +0000 UTC m=+155.272877290" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.205893 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" 
event={"ID":"f4c17a98-44ae-4995-9bcc-c398f3b9a476","Type":"ContainerStarted","Data":"3c3b47644c323d3bf343edc823024a173cee419102fd186e8687e7a11c273232"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.214863 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ssm2l" event={"ID":"4d3e2cf8-080a-42b5-8c26-1336d0279fd4","Type":"ContainerStarted","Data":"10cd3d8927531322989629caca7af7fad83262168426b3c1e3ed4c5c3bcce26f"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.218381 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kg7gq" event={"ID":"e7a62711-6cb6-4867-a232-8b8b043faa74","Type":"ContainerStarted","Data":"978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.233826 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" event={"ID":"dec63d28-86dc-4410-87a5-b7837f0d7070","Type":"ContainerStarted","Data":"43eb79b0f33bd5a7ba5cbd9965546432dad37f58abf3bff4a11b749af9495aaa"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.236063 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" event={"ID":"76679b84-27e7-4a6a-b904-f399c9b7eb8d","Type":"ContainerStarted","Data":"adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.247039 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" event={"ID":"76e4bae7-5083-477d-ac35-4ab579a104ba","Type":"ContainerStarted","Data":"c20383127e3250c0fc7ad6bb49927db5918e7898b10684cba3946b15ea14a473"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.248830 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-f9d7485db-kg7gq" podStartSLOduration=130.248810039 podStartE2EDuration="2m10.248810039s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:48.24391924 +0000 UTC m=+155.313510491" watchObservedRunningTime="2025-10-10 06:53:48.248810039 +0000 UTC m=+155.318401280" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.259604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" event={"ID":"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0","Type":"ContainerStarted","Data":"dc7f911b01e55a65986303a2e1dc60dc736b77da94b1e0c9b09cf32994eea861"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.271540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" event={"ID":"3e64e809-d579-480a-bfed-24473604cff0","Type":"ContainerStarted","Data":"fef13847c0998f8b5e88eefecd43c4a69b7ecd8cf31457e8c9ede6e2180ee77f"} Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.303190 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.308212 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.808194944 +0000 UTC m=+155.877786255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.326297 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wv8n4" podStartSLOduration=130.326279621 podStartE2EDuration="2m10.326279621s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:48.325203682 +0000 UTC m=+155.394794923" watchObservedRunningTime="2025-10-10 06:53:48.326279621 +0000 UTC m=+155.395870862" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.367475 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" podStartSLOduration=130.367456436 podStartE2EDuration="2m10.367456436s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:48.366533652 +0000 UTC m=+155.436124903" watchObservedRunningTime="2025-10-10 06:53:48.367456436 +0000 UTC m=+155.437047677" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.390991 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.407873 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.408201 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.908168109 +0000 UTC m=+155.977759350 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.408334 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.408757 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:48.908743394 +0000 UTC m=+155.978334635 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.489153 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.509437 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.510064 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.010048444 +0000 UTC m=+156.079639685 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.538911 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.554521 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.568639 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.612754 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.613088 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.11307303 +0000 UTC m=+156.182664271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.714019 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.714163 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.214131474 +0000 UTC m=+156.283722715 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.714598 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.715122 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.21511465 +0000 UTC m=+156.284705891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.748796 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.761778 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.766551 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.788250 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.797030 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g2xxs"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.811802 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-78wpk"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.814451 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f7phv"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.816797 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.817182 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.31716644 +0000 UTC m=+156.386757681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.827893 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-gsx4x"] Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.847326 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tqc8v"] Oct 10 06:53:48 crc kubenswrapper[4732]: W1010 06:53:48.858033 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd803f9bf_03e8_4757_9bc2_94692dae48b6.slice/crio-ceb0b1a5d11ac3bda80f0fa5a4d452765cf80919d8cb192eeaf3a7d4a8feeeea WatchSource:0}: Error finding container ceb0b1a5d11ac3bda80f0fa5a4d452765cf80919d8cb192eeaf3a7d4a8feeeea: Status 404 returned error can't find the container with id ceb0b1a5d11ac3bda80f0fa5a4d452765cf80919d8cb192eeaf3a7d4a8feeeea Oct 10 06:53:48 
crc kubenswrapper[4732]: W1010 06:53:48.863818 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebb0d91_a16e_4af3_ac63_8c1142e6bfac.slice/crio-a6fc517eca334609b04441c4f692e12c4bffa2243e48bda4741d25eb013889dc WatchSource:0}: Error finding container a6fc517eca334609b04441c4f692e12c4bffa2243e48bda4741d25eb013889dc: Status 404 returned error can't find the container with id a6fc517eca334609b04441c4f692e12c4bffa2243e48bda4741d25eb013889dc Oct 10 06:53:48 crc kubenswrapper[4732]: W1010 06:53:48.882013 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5830df43_dc59_4583_8492_fddb895a4266.slice/crio-d0c19ec5bce7382fac3ed4e6111ac0b3d2938353bd4f9801630224fcf1f08bbe WatchSource:0}: Error finding container d0c19ec5bce7382fac3ed4e6111ac0b3d2938353bd4f9801630224fcf1f08bbe: Status 404 returned error can't find the container with id d0c19ec5bce7382fac3ed4e6111ac0b3d2938353bd4f9801630224fcf1f08bbe Oct 10 06:53:48 crc kubenswrapper[4732]: I1010 06:53:48.919342 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:48 crc kubenswrapper[4732]: E1010 06:53:48.919864 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.419852236 +0000 UTC m=+156.489443477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.020256 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.020708 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.520667354 +0000 UTC m=+156.590258595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.122151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.122723 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.622708963 +0000 UTC m=+156.692300204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.228747 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.229424 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.729397756 +0000 UTC m=+156.798988997 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.231405 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.231730 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.731721277 +0000 UTC m=+156.801312508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.324095 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" event={"ID":"c1f6a9a4-0043-442a-9f1a-6661546d2397","Type":"ContainerStarted","Data":"1204b65f6c06d153139e17857cbfa43026282cab685b807a95ed083c223e53a8"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.331028 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" event={"ID":"3c038cc2-3a8f-43e9-afc6-8b22acca9266","Type":"ContainerStarted","Data":"31a9544029aedfa6fe2202b25830134372cd85fa4623433e89e71b73a9011881"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.332078 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.332443 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.832426821 +0000 UTC m=+156.902018062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.357071 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" podStartSLOduration=131.35704957 podStartE2EDuration="2m11.35704957s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.356390583 +0000 UTC m=+156.425981824" watchObservedRunningTime="2025-10-10 06:53:49.35704957 +0000 UTC m=+156.426640811" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.363374 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" event={"ID":"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4","Type":"ContainerStarted","Data":"dbbdb378408f58c0c28c411aef8d97b581006f03c9522564eb9ac356ec18b4b4"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.363420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" event={"ID":"3fb01cd2-b7ee-4adc-8488-29d19c6d4ac4","Type":"ContainerStarted","Data":"bc6cb6764fbc818e83d54bba5e73ade975f2c1424d1058f5c8374c67c996927e"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.379776 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" 
event={"ID":"05b8f5ea-d132-4c18-ac52-00fbac36d987","Type":"ContainerStarted","Data":"6de8c41925d07f4ff7229ce7b6b535136757fb0a272285b0fe2410ef4bc7fbad"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.395667 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" podStartSLOduration=131.395639927 podStartE2EDuration="2m11.395639927s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.389856135 +0000 UTC m=+156.459447396" watchObservedRunningTime="2025-10-10 06:53:49.395639927 +0000 UTC m=+156.465231198" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.399786 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wkrlj" event={"ID":"471e7f45-877a-4cba-8e27-b2a249dac74e","Type":"ContainerStarted","Data":"fdda74219a80d579c742f069b2da024772421fd1c73b95847b97448cf804c9f4"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.414820 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tl8xz" podStartSLOduration=131.414801982 podStartE2EDuration="2m11.414801982s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.412296636 +0000 UTC m=+156.481887897" watchObservedRunningTime="2025-10-10 06:53:49.414801982 +0000 UTC m=+156.484393233" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.436469 4732 generic.go:334] "Generic (PLEG): container finished" podID="9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8" containerID="e3e99032a458725aaae8e10b8eb39763e9e3a188526b1a389b2929b9a50267f6" exitCode=0 Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.436648 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" event={"ID":"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8","Type":"ContainerDied","Data":"e3e99032a458725aaae8e10b8eb39763e9e3a188526b1a389b2929b9a50267f6"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.440651 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.442609 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:49.942594515 +0000 UTC m=+157.012185746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.452303 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-4jwhf" podStartSLOduration=131.45228623 podStartE2EDuration="2m11.45228623s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.452110536 +0000 UTC m=+156.521701797" watchObservedRunningTime="2025-10-10 06:53:49.45228623 +0000 UTC m=+156.521877461" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.499066 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" event={"ID":"84e57a66-98b6-44af-87b4-d3fcf39fa72b","Type":"ContainerStarted","Data":"0131a043c487bf33141b01290d302142cca611bb725e841014f18cf10875340e"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.499104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" event={"ID":"84e57a66-98b6-44af-87b4-d3fcf39fa72b","Type":"ContainerStarted","Data":"05b00671748892a3fb3c6f2c5e04b51da91b60454da12bf3881b7d84e7dc3205"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.525661 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wkrlj" podStartSLOduration=131.525642314 
podStartE2EDuration="2m11.525642314s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.522435099 +0000 UTC m=+156.592026360" watchObservedRunningTime="2025-10-10 06:53:49.525642314 +0000 UTC m=+156.595233565" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.534965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" event={"ID":"e9043fa5-3781-44a6-ab0d-1e5f98755081","Type":"ContainerStarted","Data":"66236cc1dbca22e85886609a5ca297b42895592405402470870a3e323cacd8a5"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.544129 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.544783 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.044749578 +0000 UTC m=+157.114340819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.546665 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.547022 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.047013867 +0000 UTC m=+157.116605108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.564457 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" podStartSLOduration=131.564437647 podStartE2EDuration="2m11.564437647s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.564301023 +0000 UTC m=+156.633892274" watchObservedRunningTime="2025-10-10 06:53:49.564437647 +0000 UTC m=+156.634028888" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.612965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" event={"ID":"3e64e809-d579-480a-bfed-24473604cff0","Type":"ContainerStarted","Data":"010dbdba40fbe0c4d9fed1b0d462a9eef120bf4c7b8c654af7ef0e06af3cde25"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.645896 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-z2jfv" podStartSLOduration=131.645878073 podStartE2EDuration="2m11.645878073s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.645858813 +0000 UTC m=+156.715450084" watchObservedRunningTime="2025-10-10 06:53:49.645878073 +0000 UTC 
m=+156.715469314" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.648435 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.650188 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.150171346 +0000 UTC m=+157.219762587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.759859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.760136 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-10 06:53:50.260123235 +0000 UTC m=+157.329714476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.766022 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" event={"ID":"f4c17a98-44ae-4995-9bcc-c398f3b9a476","Type":"ContainerStarted","Data":"4ddb0d5d7e7f80b347323ea340ddd470ff1eed567ada07081f88cae3bb59b18e"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.766084 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" event={"ID":"f4c17a98-44ae-4995-9bcc-c398f3b9a476","Type":"ContainerStarted","Data":"ec7372d205acb8a75994f5ea0bcde9bb8aba11fa095f81a283b19fcb083950aa"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.780327 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.804653 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:49 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:49 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:49 crc kubenswrapper[4732]: healthz check failed Oct 10 
06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.805023 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.806437 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lhkx9" podStartSLOduration=131.806426585 podStartE2EDuration="2m11.806426585s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.806180469 +0000 UTC m=+156.875771710" watchObservedRunningTime="2025-10-10 06:53:49.806426585 +0000 UTC m=+156.876017826" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.819027 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" event={"ID":"7d7fb490-d6a4-4bc8-980c-d0199c8c223e","Type":"ContainerStarted","Data":"6b46568e86f4e80d9050506e7c86896eac5860a83888b489e9b4b7310351489e"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.820182 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.840975 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" podStartSLOduration=131.840960755 podStartE2EDuration="2m11.840960755s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.839954479 +0000 UTC m=+156.909545730" 
watchObservedRunningTime="2025-10-10 06:53:49.840960755 +0000 UTC m=+156.910551996" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.841331 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-knm8v" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.844099 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" event={"ID":"b325ec0f-cef2-4845-a8ba-d59a06dff2ee","Type":"ContainerStarted","Data":"497da77e55fb7e5e1b3c757f6507a62a604e0d93e5dcbd1b53ce54ab417a7dcd"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.861268 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.862402 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.3623853 +0000 UTC m=+157.431976541 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.876971 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" event={"ID":"658be8cb-31b1-4b7f-a96b-e23d029a5365","Type":"ContainerStarted","Data":"3f64def9800a5d922a50aa2a1128cadc6321d5c28426d630a4959d335d58943c"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.924045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" event={"ID":"d803f9bf-03e8-4757-9bc2-94692dae48b6","Type":"ContainerStarted","Data":"ceb0b1a5d11ac3bda80f0fa5a4d452765cf80919d8cb192eeaf3a7d4a8feeeea"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.947867 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" event={"ID":"8f6dc879-6002-4c56-971c-18d3d9d311a6","Type":"ContainerStarted","Data":"c6dc479810ca8b92b49a96ad6e874fde9695a1e960c7e91021010eb7cb698d6e"} Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.948997 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.962812 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:49 crc kubenswrapper[4732]: E1010 06:53:49.966333 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.466316499 +0000 UTC m=+157.535907740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:49 crc kubenswrapper[4732]: I1010 06:53:49.984788 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" podStartSLOduration=131.984770016 podStartE2EDuration="2m11.984770016s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:49.983188574 +0000 UTC m=+157.052779835" watchObservedRunningTime="2025-10-10 06:53:49.984770016 +0000 UTC m=+157.054361257" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.005886 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" event={"ID":"ca006e78-0280-497a-9e0b-1c52edc29e45","Type":"ContainerStarted","Data":"740a7bc85236a72e812b907755fbae44fc787dcbf68acd18f0d5c3da94dc9084"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.033059 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.033430 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.042289 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" event={"ID":"3b37b694-d6d2-4f1f-8515-ae9ecdbbda7d","Type":"ContainerStarted","Data":"d11aa749bb452e1306783940c5b19b8a0b55cc6f566a2d71e684cf5d34774b56"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.067543 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" podStartSLOduration=132.067521907 podStartE2EDuration="2m12.067521907s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.064390915 +0000 UTC m=+157.133982176" watchObservedRunningTime="2025-10-10 06:53:50.067521907 +0000 UTC m=+157.137113148" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.074749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.076092 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:53:50.576074722 +0000 UTC m=+157.645665953 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.082204 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ssm2l" event={"ID":"4d3e2cf8-080a-42b5-8c26-1336d0279fd4","Type":"ContainerStarted","Data":"3b718e7cb5899cf0cd653c1c93e046641ac697bbb5299f0c850bf5b8136bb9c2"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.082529 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-ssm2l" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.093589 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" event={"ID":"466cd789-08e1-413b-a590-8c3e3fb3cb40","Type":"ContainerStarted","Data":"002bb15673e81348ab81624d532929fa38ec352868b5c01a7376454406238765"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.103846 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" event={"ID":"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b","Type":"ContainerStarted","Data":"e968f791477e471c5da8b057da9f447aaaec0f1bea27793031d20f051be213e5"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.105265 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" event={"ID":"6bf9fc72-a5eb-4b91-b0e9-95b6782d7d6b","Type":"ContainerStarted","Data":"6176886b429283676a74a473519cb3e2875eacf708f709970bf32f0a20d6c2aa"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.108223 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" event={"ID":"e769faf1-a610-4e5f-9dbd-cac1aaa1c1d0","Type":"ContainerStarted","Data":"d2193a465a669061b09b2a70a774580d22b727a516c8d57ba31951580425b827"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.109404 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" event={"ID":"809f770a-ef33-46b3-b29c-8b98fb0440fe","Type":"ContainerStarted","Data":"bd2b4a720a7802ebcb914b50707e2d50d325581af8bdc8c63ac18330c1394cae"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.137168 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-96qlm" podStartSLOduration=132.137147782 podStartE2EDuration="2m12.137147782s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.133540397 +0000 UTC m=+157.203131658" watchObservedRunningTime="2025-10-10 06:53:50.137147782 +0000 UTC m=+157.206739023" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.147569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" event={"ID":"3f543214-fd2c-4083-b253-4f9cc914a10c","Type":"ContainerStarted","Data":"3daa654550484224f4f74d6e10ed0b373d1958ab58256408df9cac8bbd314c7d"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.151349 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" event={"ID":"bebb0d91-a16e-4af3-ac63-8c1142e6bfac","Type":"ContainerStarted","Data":"a6fc517eca334609b04441c4f692e12c4bffa2243e48bda4741d25eb013889dc"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.152221 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.153820 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" event={"ID":"52ff08f1-7920-48d3-ae45-54b18ea49dd2","Type":"ContainerStarted","Data":"0b1c5c1e4bc0fa95c7789b9d2914e2a85cb65ac37efd58c7f6b74405ccc22bba"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.153871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" event={"ID":"52ff08f1-7920-48d3-ae45-54b18ea49dd2","Type":"ContainerStarted","Data":"e41d119c16f293951f41e292f5d816d6e1e79eb40ad2a14bb5677983d41466a9"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.154251 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.155861 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" event={"ID":"209fb0dc-d6b6-476d-b160-6a0052080df5","Type":"ContainerStarted","Data":"a5e8cca616f5b3830b045d27987ee362d9493412d7662a78b70e1a40adec7567"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.162161 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" 
event={"ID":"5830df43-dc59-4583-8492-fddb895a4266","Type":"ContainerStarted","Data":"d0c19ec5bce7382fac3ed4e6111ac0b3d2938353bd4f9801630224fcf1f08bbe"} Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.165737 4732 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-8bxnv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.165796 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" podUID="52ff08f1-7920-48d3-ae45-54b18ea49dd2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.165840 4732 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-g2xxs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.165887 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" podUID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.170866 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vl4vz" event={"ID":"2e5ae2db-8e0c-4f4e-bef0-9798a6c09683","Type":"ContainerStarted","Data":"47413926dc9385b042e22749e7c4ae0d44331cfaffd1cdfa0c4185a2163989bf"} Oct 
10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.172071 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7zpr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.172174 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f7zpr" podUID="45674fdf-b85c-4d66-afc3-b0fad73523da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.176952 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.180324 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.68030467 +0000 UTC m=+157.749896001 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.202987 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-bbf6g" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.237222 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xghqq" podStartSLOduration=132.23720752 podStartE2EDuration="2m12.23720752s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.2360838 +0000 UTC m=+157.305675061" watchObservedRunningTime="2025-10-10 06:53:50.23720752 +0000 UTC m=+157.306798761" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.281386 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tmbsf" podStartSLOduration=132.281369474 podStartE2EDuration="2m12.281369474s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.28083294 +0000 UTC m=+157.350424181" watchObservedRunningTime="2025-10-10 06:53:50.281369474 +0000 UTC m=+157.350960715" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.285528 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.289569 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.789545569 +0000 UTC m=+157.859136810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.365834 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vl4vz" podStartSLOduration=8.36582014 podStartE2EDuration="8.36582014s" podCreationTimestamp="2025-10-10 06:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.365609754 +0000 UTC m=+157.435201005" watchObservedRunningTime="2025-10-10 06:53:50.36582014 +0000 UTC m=+157.435411381" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.387608 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.387990 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.887976334 +0000 UTC m=+157.957567575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.485867 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-nw2jr" podStartSLOduration=132.485836553 podStartE2EDuration="2m12.485836553s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.48493806 +0000 UTC m=+157.554529311" watchObservedRunningTime="2025-10-10 06:53:50.485836553 +0000 UTC m=+157.555427794" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.488617 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.488757 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.988730539 +0000 UTC m=+158.058321780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.489244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.490118 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:50.990103936 +0000 UTC m=+158.059695177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.541132 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" podStartSLOduration=132.54110745 podStartE2EDuration="2m12.54110745s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.518286289 +0000 UTC m=+157.587877520" watchObservedRunningTime="2025-10-10 06:53:50.54110745 +0000 UTC m=+157.610698691" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.541740 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lnn79" podStartSLOduration=132.541734207 podStartE2EDuration="2m12.541734207s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.540874604 +0000 UTC m=+157.610465865" watchObservedRunningTime="2025-10-10 06:53:50.541734207 +0000 UTC m=+157.611325448" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.599828 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.600261 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.100227708 +0000 UTC m=+158.169818949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.628660 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dq6ts" podStartSLOduration=132.628641287 podStartE2EDuration="2m12.628641287s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.570085744 +0000 UTC m=+157.639676995" watchObservedRunningTime="2025-10-10 06:53:50.628641287 +0000 UTC m=+157.698232528" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.656984 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" podStartSLOduration=132.656943543 podStartE2EDuration="2m12.656943543s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 
06:53:50.625610847 +0000 UTC m=+157.695202098" watchObservedRunningTime="2025-10-10 06:53:50.656943543 +0000 UTC m=+157.726534784" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.692519 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ssm2l" podStartSLOduration=8.69249697 podStartE2EDuration="8.69249697s" podCreationTimestamp="2025-10-10 06:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.667347238 +0000 UTC m=+157.736938489" watchObservedRunningTime="2025-10-10 06:53:50.69249697 +0000 UTC m=+157.762088201" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.701284 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.701612 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.20159952 +0000 UTC m=+158.271190761 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.711561 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" podStartSLOduration=132.711545022 podStartE2EDuration="2m12.711545022s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:50.711015169 +0000 UTC m=+157.780606420" watchObservedRunningTime="2025-10-10 06:53:50.711545022 +0000 UTC m=+157.781136263" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.783578 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:50 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:50 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:50 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.783933 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.802865 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.803119 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.303080865 +0000 UTC m=+158.372672106 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.803368 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.803729 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.303721562 +0000 UTC m=+158.373312803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.905269 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:50 crc kubenswrapper[4732]: E1010 06:53:50.905575 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.405559396 +0000 UTC m=+158.475150627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.950029 4732 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vrj7r container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.950125 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" podUID="8f6dc879-6002-4c56-971c-18d3d9d311a6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 10 06:53:50 crc kubenswrapper[4732]: I1010 06:53:50.994649 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.006827 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 
06:53:51.007166 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.507155494 +0000 UTC m=+158.576746735 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.108127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.108391 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.608356722 +0000 UTC m=+158.677947963 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.176937 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" event={"ID":"9fb5f9ce-a7c7-4491-bb8b-db52b5f615d8","Type":"ContainerStarted","Data":"eb03ee8897bb3d44da96a2f6c62a518a9fa95a3562699f61e95cca89a9a1869e"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.177553 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.179153 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" event={"ID":"466cd789-08e1-413b-a590-8c3e3fb3cb40","Type":"ContainerStarted","Data":"619fb293853b4d9a8d859f3dfd70bb75139eb74f4fd060777fc1020b2cde29aa"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.179205 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" event={"ID":"466cd789-08e1-413b-a590-8c3e3fb3cb40","Type":"ContainerStarted","Data":"9e08ae79ac4db43b723d4729acfe7d1d58c7eed4233431ec29e0ecb921b0923d"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.179245 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.180631 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" event={"ID":"5830df43-dc59-4583-8492-fddb895a4266","Type":"ContainerStarted","Data":"bfaa549dbabbec117b87c06d627116fa26629a0b77ebfba7df9c70fcf697e0e2"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.182121 4732 generic.go:334] "Generic (PLEG): container finished" podID="76e4bae7-5083-477d-ac35-4ab579a104ba" containerID="c20383127e3250c0fc7ad6bb49927db5918e7898b10684cba3946b15ea14a473" exitCode=0 Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.182195 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" event={"ID":"76e4bae7-5083-477d-ac35-4ab579a104ba","Type":"ContainerDied","Data":"c20383127e3250c0fc7ad6bb49927db5918e7898b10684cba3946b15ea14a473"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.184377 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" event={"ID":"658be8cb-31b1-4b7f-a96b-e23d029a5365","Type":"ContainerStarted","Data":"e921f8a5ca7d6bab86366693ce02c0fec97d4785e81d1f0a4f71597f5d6d52bf"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.184402 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" event={"ID":"658be8cb-31b1-4b7f-a96b-e23d029a5365","Type":"ContainerStarted","Data":"41573d857433709c7ffb953ff21d54cdcf7a539280f8d182a7f0d17124b4a9db"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.186594 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" event={"ID":"d803f9bf-03e8-4757-9bc2-94692dae48b6","Type":"ContainerStarted","Data":"ff11161ddc61525b70d492f045c9f18c255c3c0756102cb3c05909304c07bfea"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.186648 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-77wqx" event={"ID":"d803f9bf-03e8-4757-9bc2-94692dae48b6","Type":"ContainerStarted","Data":"e04412ea3d91ca0ddaa9f25539146441e1821c0fc9e13fd0420d4a6c798121e5"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.188403 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" event={"ID":"84e57a66-98b6-44af-87b4-d3fcf39fa72b","Type":"ContainerStarted","Data":"d7c03c06542388e05b1f0925b7dcb2dfe5e7c6e0d22ad0850fed643f4a442321"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.189807 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2q7sx" event={"ID":"e9043fa5-3781-44a6-ab0d-1e5f98755081","Type":"ContainerStarted","Data":"71d66292cd35d23e63559b51e1ff10ab0f6217443fe02ff85ae870068a4929c5"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.191154 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-78wpk" event={"ID":"809f770a-ef33-46b3-b29c-8b98fb0440fe","Type":"ContainerStarted","Data":"1aba6d6ea250483ba88d5a78252050bc90a3c1b4b4f0b3e54816e0d602be95d8"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.192654 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" event={"ID":"b325ec0f-cef2-4845-a8ba-d59a06dff2ee","Type":"ContainerStarted","Data":"25f22d73abf53efea4c9e4237856ed6287e3a8e5c513d95e394dd27e30543419"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.192684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" event={"ID":"b325ec0f-cef2-4845-a8ba-d59a06dff2ee","Type":"ContainerStarted","Data":"6dfab57974d61f07dd2d3c9d2d17e5429f98b60a96085f1c759844f267bd2734"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.194772 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ssm2l" event={"ID":"4d3e2cf8-080a-42b5-8c26-1336d0279fd4","Type":"ContainerStarted","Data":"c8c03444fc0c84200aee32210f46b6572dcb0f2b094850a63a9b36fa2699189e"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.196360 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" event={"ID":"bebb0d91-a16e-4af3-ac63-8c1142e6bfac","Type":"ContainerStarted","Data":"4164ebb704d4c8bc353411a720e1d685a7e31d63d2f35d3337d14ca5170fa6ec"} Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.197357 4732 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-g2xxs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.197407 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" podUID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.201529 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-8bxnv" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.209006 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zbqvm" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.209086 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.209379 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.709366684 +0000 UTC m=+158.778957925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.212534 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" podStartSLOduration=133.212514937 podStartE2EDuration="2m13.212514937s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:51.210750751 +0000 UTC m=+158.280342012" watchObservedRunningTime="2025-10-10 06:53:51.212514937 +0000 UTC m=+158.282106178" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.271593 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-f7phv" podStartSLOduration=133.271571324 podStartE2EDuration="2m13.271571324s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:51.24295329 +0000 UTC m=+158.312544551" watchObservedRunningTime="2025-10-10 06:53:51.271571324 +0000 UTC m=+158.341162565" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.272660 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" podStartSLOduration=133.272652702 podStartE2EDuration="2m13.272652702s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:51.272223691 +0000 UTC m=+158.341814942" watchObservedRunningTime="2025-10-10 06:53:51.272652702 +0000 UTC m=+158.342243953" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.310143 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.310565 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.810534581 +0000 UTC m=+158.880125822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.324742 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.325073 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.825058514 +0000 UTC m=+158.894649755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.422652 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jh5dr" podStartSLOduration=133.422631236 podStartE2EDuration="2m13.422631236s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:51.393753385 +0000 UTC m=+158.463344636" watchObservedRunningTime="2025-10-10 06:53:51.422631236 +0000 UTC m=+158.492222477" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.425268 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.425512 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:51.925501321 +0000 UTC m=+158.995092562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.500605 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-tqc8v" podStartSLOduration=133.500590651 podStartE2EDuration="2m13.500590651s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:51.498096345 +0000 UTC m=+158.567687606" watchObservedRunningTime="2025-10-10 06:53:51.500590651 +0000 UTC m=+158.570181882" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.544369 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.544629 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vrj7r" Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.544927 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-10 06:53:52.044912769 +0000 UTC m=+159.114504010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.645196 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.645470 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.145452618 +0000 UTC m=+159.215043859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.746498 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.746896 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.246881021 +0000 UTC m=+159.316472262 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.795586 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:51 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:51 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:51 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.795644 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.848110 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.848581 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-10 06:53:52.348558931 +0000 UTC m=+159.418150172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:51 crc kubenswrapper[4732]: I1010 06:53:51.949239 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:51 crc kubenswrapper[4732]: E1010 06:53:51.949789 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.449768339 +0000 UTC m=+159.519359640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.050477 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.050656 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.550633968 +0000 UTC m=+159.620225209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.050795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.051129 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.551117761 +0000 UTC m=+159.620709002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.152037 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.152225 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.652195085 +0000 UTC m=+159.721786326 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.152466 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.152795 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.65277847 +0000 UTC m=+159.722369711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.204097 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" event={"ID":"5830df43-dc59-4583-8492-fddb895a4266","Type":"ContainerStarted","Data":"34837f672682ad0896813a586a6eafb494cf98c41621895b6039a53f16b2d11d"} Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.209833 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.223030 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gtdln"] Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.223992 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.235124 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.242346 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtdln"] Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.259249 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.261873 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.761845985 +0000 UTC m=+159.831437306 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.262377 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-catalog-content\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.262577 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brq5m\" (UniqueName: \"kubernetes.io/projected/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-kube-api-access-brq5m\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.262645 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-utilities\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.262976 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.268468 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.768453749 +0000 UTC m=+159.838044990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.366043 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.366524 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-catalog-content\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.366563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brq5m\" (UniqueName: 
\"kubernetes.io/projected/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-kube-api-access-brq5m\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.366580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-utilities\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.366999 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-utilities\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.367068 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.867050488 +0000 UTC m=+159.936641729 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.367263 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-catalog-content\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.394903 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brq5m\" (UniqueName: \"kubernetes.io/projected/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-kube-api-access-brq5m\") pod \"certified-operators-gtdln\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.426307 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zzmww"] Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.427386 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.432375 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.436251 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzmww"] Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.473325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-catalog-content\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.473369 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-874j8\" (UniqueName: \"kubernetes.io/projected/7c8aeb7a-213a-4677-9877-69a57de9d13a-kube-api-access-874j8\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.473409 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-utilities\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.473451 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.473779 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:52.973764521 +0000 UTC m=+160.043355762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.494258 4732 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.535298 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.538600 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.573994 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76e4bae7-5083-477d-ac35-4ab579a104ba-secret-volume\") pod \"76e4bae7-5083-477d-ac35-4ab579a104ba\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.574123 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkqjw\" (UniqueName: \"kubernetes.io/projected/76e4bae7-5083-477d-ac35-4ab579a104ba-kube-api-access-fkqjw\") pod \"76e4bae7-5083-477d-ac35-4ab579a104ba\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.574156 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76e4bae7-5083-477d-ac35-4ab579a104ba-config-volume\") pod \"76e4bae7-5083-477d-ac35-4ab579a104ba\" (UID: \"76e4bae7-5083-477d-ac35-4ab579a104ba\") " Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.574279 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.574459 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-catalog-content\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 
06:53:52.574484 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-874j8\" (UniqueName: \"kubernetes.io/projected/7c8aeb7a-213a-4677-9877-69a57de9d13a-kube-api-access-874j8\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.574526 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-utilities\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.575039 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-utilities\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.575739 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:53.075717468 +0000 UTC m=+160.145308709 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.576333 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-catalog-content\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.576747 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76e4bae7-5083-477d-ac35-4ab579a104ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "76e4bae7-5083-477d-ac35-4ab579a104ba" (UID: "76e4bae7-5083-477d-ac35-4ab579a104ba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.580899 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e4bae7-5083-477d-ac35-4ab579a104ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76e4bae7-5083-477d-ac35-4ab579a104ba" (UID: "76e4bae7-5083-477d-ac35-4ab579a104ba"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.583258 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e4bae7-5083-477d-ac35-4ab579a104ba-kube-api-access-fkqjw" (OuterVolumeSpecName: "kube-api-access-fkqjw") pod "76e4bae7-5083-477d-ac35-4ab579a104ba" (UID: "76e4bae7-5083-477d-ac35-4ab579a104ba"). InnerVolumeSpecName "kube-api-access-fkqjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.600179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-874j8\" (UniqueName: \"kubernetes.io/projected/7c8aeb7a-213a-4677-9877-69a57de9d13a-kube-api-access-874j8\") pod \"community-operators-zzmww\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.631995 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hgtsw"] Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.632237 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e4bae7-5083-477d-ac35-4ab579a104ba" containerName="collect-profiles" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.632256 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e4bae7-5083-477d-ac35-4ab579a104ba" containerName="collect-profiles" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.632382 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e4bae7-5083-477d-ac35-4ab579a104ba" containerName="collect-profiles" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.637269 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgtsw"] Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.637412 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.675520 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-catalog-content\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.675547 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8rb7\" (UniqueName: \"kubernetes.io/projected/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-kube-api-access-p8rb7\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.675577 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.675650 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-utilities\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.675682 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/76e4bae7-5083-477d-ac35-4ab579a104ba-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.675707 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76e4bae7-5083-477d-ac35-4ab579a104ba-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.675717 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkqjw\" (UniqueName: \"kubernetes.io/projected/76e4bae7-5083-477d-ac35-4ab579a104ba-kube-api-access-fkqjw\") on node \"crc\" DevicePath \"\"" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.675949 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:53.17593747 +0000 UTC m=+160.245528711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.776604 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.776848 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-10 06:53:53.276817019 +0000 UTC m=+160.346408260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.776902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-utilities\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.776987 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8rb7\" (UniqueName: \"kubernetes.io/projected/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-kube-api-access-p8rb7\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.777010 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-catalog-content\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.777053 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: 
\"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:52 crc kubenswrapper[4732]: E1010 06:53:52.777470 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-10 06:53:53.277463356 +0000 UTC m=+160.347054597 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-8tq5m" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.777874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-catalog-content\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.778735 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-utilities\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.783439 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Oct 10 06:53:52 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:52 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:52 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.783480 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.798018 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8rb7\" (UniqueName: \"kubernetes.io/projected/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-kube-api-access-p8rb7\") pod \"certified-operators-hgtsw\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.814968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtdln"] Oct 10 06:53:52 crc kubenswrapper[4732]: W1010 06:53:52.823209 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9613d7c5_01ef_4d3d_8ebe_c2dab4e83b4d.slice/crio-405e10c6baf4dca96285a5406764bee0a2fb9810d55da2b25d57f90c84e49d8f WatchSource:0}: Error finding container 405e10c6baf4dca96285a5406764bee0a2fb9810d55da2b25d57f90c84e49d8f: Status 404 returned error can't find the container with id 405e10c6baf4dca96285a5406764bee0a2fb9810d55da2b25d57f90c84e49d8f Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.825511 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f9cdm"] Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.826510 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.829193 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.856500 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9cdm"] Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.856590 4732 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-10T06:53:52.494285542Z","Handler":null,"Name":""} Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.866507 4732 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.866554 4732 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.881254 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.881584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84mwh\" (UniqueName: \"kubernetes.io/projected/e249575a-4aa6-40df-ab92-1a72d840a00b-kube-api-access-84mwh\") pod \"community-operators-f9cdm\" (UID: 
\"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.881649 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-catalog-content\") pod \"community-operators-f9cdm\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.881739 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-utilities\") pod \"community-operators-f9cdm\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.909300 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.957759 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.982792 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84mwh\" (UniqueName: \"kubernetes.io/projected/e249575a-4aa6-40df-ab92-1a72d840a00b-kube-api-access-84mwh\") pod \"community-operators-f9cdm\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.982874 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.982904 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-catalog-content\") pod \"community-operators-f9cdm\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.982973 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-utilities\") pod \"community-operators-f9cdm\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.983456 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-utilities\") pod \"community-operators-f9cdm\" 
(UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.983645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-catalog-content\") pod \"community-operators-f9cdm\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.992107 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 06:53:52 crc kubenswrapper[4732]: I1010 06:53:52.992145 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.003388 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84mwh\" (UniqueName: \"kubernetes.io/projected/e249575a-4aa6-40df-ab92-1a72d840a00b-kube-api-access-84mwh\") pod \"community-operators-f9cdm\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.049718 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-8tq5m\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.158100 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.213600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" event={"ID":"76e4bae7-5083-477d-ac35-4ab579a104ba","Type":"ContainerDied","Data":"1ac9e5c1806c0d2d951fe6104e6e0f45c0c4414e81e6ad772f716e5bad16a491"} Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.213660 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac9e5c1806c0d2d951fe6104e6e0f45c0c4414e81e6ad772f716e5bad16a491" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.213822 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.218670 4732 generic.go:334] "Generic (PLEG): container finished" podID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerID="7b8c8e45af5ec9f1f11423fa22eb324c32fbd9b5218f56a40f210b51fa21fe5d" exitCode=0 Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.218943 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdln" event={"ID":"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d","Type":"ContainerDied","Data":"7b8c8e45af5ec9f1f11423fa22eb324c32fbd9b5218f56a40f210b51fa21fe5d"} Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.219040 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdln" event={"ID":"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d","Type":"ContainerStarted","Data":"405e10c6baf4dca96285a5406764bee0a2fb9810d55da2b25d57f90c84e49d8f"} Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.223493 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.227042 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.229793 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" event={"ID":"5830df43-dc59-4583-8492-fddb895a4266","Type":"ContainerStarted","Data":"7ed63c823d4674b0ee57d69bb746029a114841c0faefe1f4e8ca094172f7a1e5"} Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.229848 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" event={"ID":"5830df43-dc59-4583-8492-fddb895a4266","Type":"ContainerStarted","Data":"d35f8a368fb0b67a65865684fc313c81911d33a7e7cd67c90965e86aca488f28"} Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.254428 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hgtsw"] Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.267813 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-gsx4x" podStartSLOduration=11.26779448 podStartE2EDuration="11.26779448s" podCreationTimestamp="2025-10-10 06:53:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:53.267212065 +0000 UTC m=+160.336803326" watchObservedRunningTime="2025-10-10 06:53:53.26779448 +0000 UTC m=+160.337385721" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.376377 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zzmww"] Oct 10 06:53:53 crc kubenswrapper[4732]: W1010 06:53:53.394268 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c8aeb7a_213a_4677_9877_69a57de9d13a.slice/crio-c9b0b24c54bafcc307e4e41fab91c4e84a0292ca5e9ad8631d5d4f3e32fb9236 WatchSource:0}: Error finding container c9b0b24c54bafcc307e4e41fab91c4e84a0292ca5e9ad8631d5d4f3e32fb9236: Status 404 returned error can't find the container with id c9b0b24c54bafcc307e4e41fab91c4e84a0292ca5e9ad8631d5d4f3e32fb9236 Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.504817 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8tq5m"] Oct 10 06:53:53 crc kubenswrapper[4732]: W1010 06:53:53.521920 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3d2fadd_7bf3_40b3_9c28_69a63eeb3945.slice/crio-b1efda1d0864183d1f6fa3d420e62ef55042a556bbd3af357d69d7b32378a008 WatchSource:0}: Error finding container b1efda1d0864183d1f6fa3d420e62ef55042a556bbd3af357d69d7b32378a008: Status 404 returned error can't find the container with id b1efda1d0864183d1f6fa3d420e62ef55042a556bbd3af357d69d7b32378a008 Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.648824 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f9cdm"] Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.673043 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 10 06:53:53 crc kubenswrapper[4732]: W1010 06:53:53.678934 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode249575a_4aa6_40df_ab92_1a72d840a00b.slice/crio-b487d345c8393969a42585c8260bd5df0bdf818eea7bf01a3ec30e92b95b17d4 WatchSource:0}: Error finding container b487d345c8393969a42585c8260bd5df0bdf818eea7bf01a3ec30e92b95b17d4: Status 404 returned error 
can't find the container with id b487d345c8393969a42585c8260bd5df0bdf818eea7bf01a3ec30e92b95b17d4 Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.784242 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:53 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:53 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:53 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.784300 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.988803 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.988927 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:53 crc kubenswrapper[4732]: I1010 06:53:53.995911 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.168808 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7dnbb" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.237639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" 
event={"ID":"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945","Type":"ContainerStarted","Data":"c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.237731 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" event={"ID":"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945","Type":"ContainerStarted","Data":"b1efda1d0864183d1f6fa3d420e62ef55042a556bbd3af357d69d7b32378a008"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.237753 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.240478 4732 generic.go:334] "Generic (PLEG): container finished" podID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerID="3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805" exitCode=0 Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.240573 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgtsw" event={"ID":"51d8b2a9-41c8-43ce-b4d6-accaaea69afb","Type":"ContainerDied","Data":"3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.240614 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgtsw" event={"ID":"51d8b2a9-41c8-43ce-b4d6-accaaea69afb","Type":"ContainerStarted","Data":"23d146a92138ecf4793be1966892d1bc19d85be1717ef1858c9bb7134bcbe380"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.243957 4732 generic.go:334] "Generic (PLEG): container finished" podID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerID="1f822d7a5347133d62ee7db592f0d0892b78a9f37aecf9608a703350f0738486" exitCode=0 Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.243986 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zzmww" event={"ID":"7c8aeb7a-213a-4677-9877-69a57de9d13a","Type":"ContainerDied","Data":"1f822d7a5347133d62ee7db592f0d0892b78a9f37aecf9608a703350f0738486"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.244022 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzmww" event={"ID":"7c8aeb7a-213a-4677-9877-69a57de9d13a","Type":"ContainerStarted","Data":"c9b0b24c54bafcc307e4e41fab91c4e84a0292ca5e9ad8631d5d4f3e32fb9236"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.245766 4732 generic.go:334] "Generic (PLEG): container finished" podID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerID="ea8ea8c500db34dcffb3c38937267888369be05d1ed9e36f5931f72949651d57" exitCode=0 Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.245844 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9cdm" event={"ID":"e249575a-4aa6-40df-ab92-1a72d840a00b","Type":"ContainerDied","Data":"ea8ea8c500db34dcffb3c38937267888369be05d1ed9e36f5931f72949651d57"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.245873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9cdm" event={"ID":"e249575a-4aa6-40df-ab92-1a72d840a00b","Type":"ContainerStarted","Data":"b487d345c8393969a42585c8260bd5df0bdf818eea7bf01a3ec30e92b95b17d4"} Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.253741 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2lfbx" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.255345 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" podStartSLOduration=136.255282909 podStartE2EDuration="2m16.255282909s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:53:54.254009365 +0000 UTC m=+161.323600616" watchObservedRunningTime="2025-10-10 06:53:54.255282909 +0000 UTC m=+161.324874150" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.377200 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.377848 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.381047 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.381302 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.384687 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.421052 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.421108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 
06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.436362 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z627g"] Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.437357 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.452717 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z627g"] Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.453777 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.522195 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.522268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kggmn\" (UniqueName: \"kubernetes.io/projected/4956619d-c22f-4cd0-983b-70aeb971dde7-kube-api-access-kggmn\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.522287 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-catalog-content\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: 
I1010 06:53:54.522304 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.522361 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-utilities\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.523234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.559142 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.622972 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-utilities\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.623033 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kggmn\" (UniqueName: \"kubernetes.io/projected/4956619d-c22f-4cd0-983b-70aeb971dde7-kube-api-access-kggmn\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.623052 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-catalog-content\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.623436 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-catalog-content\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.623580 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-utilities\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.652994 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kggmn\" (UniqueName: \"kubernetes.io/projected/4956619d-c22f-4cd0-983b-70aeb971dde7-kube-api-access-kggmn\") pod \"redhat-marketplace-z627g\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.734529 4732 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.765080 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.784441 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:54 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:54 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:54 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.784493 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.825350 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8ft97"] Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.830172 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.832842 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ft97"] Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.930829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-utilities\") pod \"redhat-marketplace-8ft97\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.930952 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h777j\" (UniqueName: \"kubernetes.io/projected/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-kube-api-access-h777j\") pod \"redhat-marketplace-8ft97\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:54 crc kubenswrapper[4732]: I1010 06:53:54.931055 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-catalog-content\") pod \"redhat-marketplace-8ft97\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.004081 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.032431 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-utilities\") pod \"redhat-marketplace-8ft97\" (UID: 
\"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.032469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h777j\" (UniqueName: \"kubernetes.io/projected/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-kube-api-access-h777j\") pod \"redhat-marketplace-8ft97\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.032534 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-catalog-content\") pod \"redhat-marketplace-8ft97\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.033168 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-catalog-content\") pod \"redhat-marketplace-8ft97\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.033644 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-utilities\") pod \"redhat-marketplace-8ft97\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.074962 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h777j\" (UniqueName: \"kubernetes.io/projected/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-kube-api-access-h777j\") pod \"redhat-marketplace-8ft97\" (UID: 
\"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.085211 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z627g"] Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.089920 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7zpr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.090013 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-f7zpr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.090029 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-f7zpr" podUID="45674fdf-b85c-4d66-afc3-b0fad73523da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.090068 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f7zpr" podUID="45674fdf-b85c-4d66-afc3-b0fad73523da" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.19:8080/\": dial tcp 10.217.0.19:8080: connect: connection refused" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.167979 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.335361 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"97f9024d-5fe4-4c0d-9fe4-7544a948f085","Type":"ContainerStarted","Data":"bd1c219f42a0e081e8f0fb81299ce0b575e426fc12652440864bbd12c5fc410f"} Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.341929 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z627g" event={"ID":"4956619d-c22f-4cd0-983b-70aeb971dde7","Type":"ContainerStarted","Data":"6016a20a483bcac9d5efbced4d01d4dda6099b8619344182a5ffc21c5e280c48"} Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.355952 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.356031 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.369816 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.369864 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.371675 4732 patch_prober.go:28] interesting pod/console-f9d7485db-kg7gq container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.371774 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-kg7gq" podUID="e7a62711-6cb6-4867-a232-8b8b043faa74" containerName="console" probeResult="failure" output="Get \"https://10.217.0.13:8443/health\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.425761 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5m96q"] Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.427843 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.429606 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.443063 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5m96q"] Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.538839 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5pgx\" (UniqueName: \"kubernetes.io/projected/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-kube-api-access-h5pgx\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.538894 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-catalog-content\") pod \"redhat-operators-5m96q\" (UID: 
\"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.538960 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-utilities\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.639757 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-utilities\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.639882 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5pgx\" (UniqueName: \"kubernetes.io/projected/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-kube-api-access-h5pgx\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.639946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-catalog-content\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.640487 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-catalog-content\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " 
pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.640565 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-utilities\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.686003 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5pgx\" (UniqueName: \"kubernetes.io/projected/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-kube-api-access-h5pgx\") pod \"redhat-operators-5m96q\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.780736 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.784166 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:55 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:55 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:55 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.784213 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.812920 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8ft97"] Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.815885 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.824279 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cpwlv"] Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.825487 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.832434 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpwlv"] Oct 10 06:53:55 crc kubenswrapper[4732]: W1010 06:53:55.944298 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02bfa9a7_ce97_4ec1_a20a_171d59d53f08.slice/crio-69e86c390e28e69c4dbf289085ea041306248dfb6cf54b492921c35aa2036934 WatchSource:0}: Error finding container 69e86c390e28e69c4dbf289085ea041306248dfb6cf54b492921c35aa2036934: Status 404 returned error can't find the container with id 69e86c390e28e69c4dbf289085ea041306248dfb6cf54b492921c35aa2036934 Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.946600 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xhr6\" (UniqueName: \"kubernetes.io/projected/71937edd-921b-491b-96b9-0c48117ae2ce-kube-api-access-7xhr6\") pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.946756 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-utilities\") 
pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:55 crc kubenswrapper[4732]: I1010 06:53:55.946786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-catalog-content\") pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.048266 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-utilities\") pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.048302 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-catalog-content\") pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.048386 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xhr6\" (UniqueName: \"kubernetes.io/projected/71937edd-921b-491b-96b9-0c48117ae2ce-kube-api-access-7xhr6\") pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.049001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-utilities\") pod \"redhat-operators-cpwlv\" (UID: 
\"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.049299 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-catalog-content\") pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.064267 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xhr6\" (UniqueName: \"kubernetes.io/projected/71937edd-921b-491b-96b9-0c48117ae2ce-kube-api-access-7xhr6\") pod \"redhat-operators-cpwlv\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.147112 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.200766 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5m96q"] Oct 10 06:53:56 crc kubenswrapper[4732]: W1010 06:53:56.230830 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cfde542_d31f_4d68_a738_a2fcdbddfbeb.slice/crio-f75a67fe0c7039082d609e6f95c6e843655d327e0a288bb188733c0dc5b0f841 WatchSource:0}: Error finding container f75a67fe0c7039082d609e6f95c6e843655d327e0a288bb188733c0dc5b0f841: Status 404 returned error can't find the container with id f75a67fe0c7039082d609e6f95c6e843655d327e0a288bb188733c0dc5b0f841 Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.362892 4732 generic.go:334] "Generic (PLEG): container finished" podID="4956619d-c22f-4cd0-983b-70aeb971dde7" 
containerID="9417bea2d553de16dcf61e2919f47961933374504f250c92b0a9cbe2a27930cc" exitCode=0 Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.362973 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z627g" event={"ID":"4956619d-c22f-4cd0-983b-70aeb971dde7","Type":"ContainerDied","Data":"9417bea2d553de16dcf61e2919f47961933374504f250c92b0a9cbe2a27930cc"} Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.366096 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5m96q" event={"ID":"5cfde542-d31f-4d68-a738-a2fcdbddfbeb","Type":"ContainerStarted","Data":"f75a67fe0c7039082d609e6f95c6e843655d327e0a288bb188733c0dc5b0f841"} Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.367613 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ft97" event={"ID":"02bfa9a7-ce97-4ec1-a20a-171d59d53f08","Type":"ContainerStarted","Data":"69e86c390e28e69c4dbf289085ea041306248dfb6cf54b492921c35aa2036934"} Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.373816 4732 generic.go:334] "Generic (PLEG): container finished" podID="97f9024d-5fe4-4c0d-9fe4-7544a948f085" containerID="5f753f616845efa1af708217ce9ddebb0867662c8622b24949803a5ac63247d4" exitCode=0 Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.373853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"97f9024d-5fe4-4c0d-9fe4-7544a948f085","Type":"ContainerDied","Data":"5f753f616845efa1af708217ce9ddebb0867662c8622b24949803a5ac63247d4"} Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.459749 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cpwlv"] Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.481750 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 10 06:53:56 crc 
kubenswrapper[4732]: I1010 06:53:56.483430 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.486212 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.488900 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.489905 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 10 06:53:56 crc kubenswrapper[4732]: W1010 06:53:56.504237 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71937edd_921b_491b_96b9_0c48117ae2ce.slice/crio-47ce5b734febbf8ae3e317c59bd33dccf78ec0143dfa4d4e5e7d0426ef3a3371 WatchSource:0}: Error finding container 47ce5b734febbf8ae3e317c59bd33dccf78ec0143dfa4d4e5e7d0426ef3a3371: Status 404 returned error can't find the container with id 47ce5b734febbf8ae3e317c59bd33dccf78ec0143dfa4d4e5e7d0426ef3a3371 Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.554299 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/786942b2-53ca-4df8-8d52-0324504dbd3c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"786942b2-53ca-4df8-8d52-0324504dbd3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.554460 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/786942b2-53ca-4df8-8d52-0324504dbd3c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"786942b2-53ca-4df8-8d52-0324504dbd3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.655315 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/786942b2-53ca-4df8-8d52-0324504dbd3c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"786942b2-53ca-4df8-8d52-0324504dbd3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.655409 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/786942b2-53ca-4df8-8d52-0324504dbd3c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"786942b2-53ca-4df8-8d52-0324504dbd3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.655556 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/786942b2-53ca-4df8-8d52-0324504dbd3c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"786942b2-53ca-4df8-8d52-0324504dbd3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.674906 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/786942b2-53ca-4df8-8d52-0324504dbd3c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"786942b2-53ca-4df8-8d52-0324504dbd3c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.793161 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:56 crc kubenswrapper[4732]: 
[-]has-synced failed: reason withheld Oct 10 06:53:56 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:56 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.793385 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:56 crc kubenswrapper[4732]: I1010 06:53:56.823253 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.290233 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.407629 4732 generic.go:334] "Generic (PLEG): container finished" podID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerID="2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec" exitCode=0 Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.407742 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ft97" event={"ID":"02bfa9a7-ce97-4ec1-a20a-171d59d53f08","Type":"ContainerDied","Data":"2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec"} Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.417324 4732 generic.go:334] "Generic (PLEG): container finished" podID="71937edd-921b-491b-96b9-0c48117ae2ce" containerID="8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859" exitCode=0 Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.417583 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpwlv" event={"ID":"71937edd-921b-491b-96b9-0c48117ae2ce","Type":"ContainerDied","Data":"8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859"} Oct 10 
06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.417662 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpwlv" event={"ID":"71937edd-921b-491b-96b9-0c48117ae2ce","Type":"ContainerStarted","Data":"47ce5b734febbf8ae3e317c59bd33dccf78ec0143dfa4d4e5e7d0426ef3a3371"} Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.421307 4732 generic.go:334] "Generic (PLEG): container finished" podID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerID="6e46140a2b248830b89cf579bfc53a3861a5004f4ab4b9274b2f8eedce3c61af" exitCode=0 Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.421349 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5m96q" event={"ID":"5cfde542-d31f-4d68-a738-a2fcdbddfbeb","Type":"ContainerDied","Data":"6e46140a2b248830b89cf579bfc53a3861a5004f4ab4b9274b2f8eedce3c61af"} Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.424541 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"786942b2-53ca-4df8-8d52-0324504dbd3c","Type":"ContainerStarted","Data":"9ed4537f1a08005e9dcd7c1eb193fdce68b7944ce76fa20621d0c0a34b3560c7"} Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.735321 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.778005 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kubelet-dir\") pod \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.778089 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "97f9024d-5fe4-4c0d-9fe4-7544a948f085" (UID: "97f9024d-5fe4-4c0d-9fe4-7544a948f085"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.778139 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kube-api-access\") pod \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\" (UID: \"97f9024d-5fe4-4c0d-9fe4-7544a948f085\") " Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.778365 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.785978 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "97f9024d-5fe4-4c0d-9fe4-7544a948f085" (UID: "97f9024d-5fe4-4c0d-9fe4-7544a948f085"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.792541 4732 patch_prober.go:28] interesting pod/router-default-5444994796-wkrlj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 10 06:53:57 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Oct 10 06:53:57 crc kubenswrapper[4732]: [+]process-running ok Oct 10 06:53:57 crc kubenswrapper[4732]: healthz check failed Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.792623 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wkrlj" podUID="471e7f45-877a-4cba-8e27-b2a249dac74e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 10 06:53:57 crc kubenswrapper[4732]: I1010 06:53:57.879568 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/97f9024d-5fe4-4c0d-9fe4-7544a948f085-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:53:58 crc kubenswrapper[4732]: I1010 06:53:58.431796 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"786942b2-53ca-4df8-8d52-0324504dbd3c","Type":"ContainerStarted","Data":"5190d922dfd3e45a860b25b5f18dd67ae771bd85a0fa1b6a736261f68b0cffc1"} Oct 10 06:53:58 crc kubenswrapper[4732]: I1010 06:53:58.434860 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"97f9024d-5fe4-4c0d-9fe4-7544a948f085","Type":"ContainerDied","Data":"bd1c219f42a0e081e8f0fb81299ce0b575e426fc12652440864bbd12c5fc410f"} Oct 10 06:53:58 crc kubenswrapper[4732]: I1010 06:53:58.434883 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="bd1c219f42a0e081e8f0fb81299ce0b575e426fc12652440864bbd12c5fc410f" Oct 10 06:53:58 crc kubenswrapper[4732]: I1010 06:53:58.434903 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 10 06:53:58 crc kubenswrapper[4732]: I1010 06:53:58.783522 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:58 crc kubenswrapper[4732]: I1010 06:53:58.786875 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wkrlj" Oct 10 06:53:59 crc kubenswrapper[4732]: I1010 06:53:59.445732 4732 generic.go:334] "Generic (PLEG): container finished" podID="786942b2-53ca-4df8-8d52-0324504dbd3c" containerID="5190d922dfd3e45a860b25b5f18dd67ae771bd85a0fa1b6a736261f68b0cffc1" exitCode=0 Oct 10 06:53:59 crc kubenswrapper[4732]: I1010 06:53:59.445885 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"786942b2-53ca-4df8-8d52-0324504dbd3c","Type":"ContainerDied","Data":"5190d922dfd3e45a860b25b5f18dd67ae771bd85a0fa1b6a736261f68b0cffc1"} Oct 10 06:54:00 crc kubenswrapper[4732]: I1010 06:54:00.415259 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:54:00 crc kubenswrapper[4732]: I1010 06:54:00.442110 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77abff23-1622-4219-a841-49fe8dbb6cc3-metrics-certs\") pod \"network-metrics-daemon-mj7bk\" (UID: \"77abff23-1622-4219-a841-49fe8dbb6cc3\") " 
pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:54:00 crc kubenswrapper[4732]: I1010 06:54:00.587382 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mj7bk" Oct 10 06:54:00 crc kubenswrapper[4732]: I1010 06:54:00.675821 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ssm2l" Oct 10 06:54:05 crc kubenswrapper[4732]: I1010 06:54:05.098526 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-f7zpr" Oct 10 06:54:05 crc kubenswrapper[4732]: I1010 06:54:05.379733 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:54:05 crc kubenswrapper[4732]: I1010 06:54:05.392507 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.119376 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.205335 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/786942b2-53ca-4df8-8d52-0324504dbd3c-kube-api-access\") pod \"786942b2-53ca-4df8-8d52-0324504dbd3c\" (UID: \"786942b2-53ca-4df8-8d52-0324504dbd3c\") " Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.205414 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/786942b2-53ca-4df8-8d52-0324504dbd3c-kubelet-dir\") pod \"786942b2-53ca-4df8-8d52-0324504dbd3c\" (UID: \"786942b2-53ca-4df8-8d52-0324504dbd3c\") " Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.205562 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/786942b2-53ca-4df8-8d52-0324504dbd3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "786942b2-53ca-4df8-8d52-0324504dbd3c" (UID: "786942b2-53ca-4df8-8d52-0324504dbd3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.213223 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/786942b2-53ca-4df8-8d52-0324504dbd3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "786942b2-53ca-4df8-8d52-0324504dbd3c" (UID: "786942b2-53ca-4df8-8d52-0324504dbd3c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.307142 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/786942b2-53ca-4df8-8d52-0324504dbd3c-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.307184 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/786942b2-53ca-4df8-8d52-0324504dbd3c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.523208 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"786942b2-53ca-4df8-8d52-0324504dbd3c","Type":"ContainerDied","Data":"9ed4537f1a08005e9dcd7c1eb193fdce68b7944ce76fa20621d0c0a34b3560c7"} Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.523252 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed4537f1a08005e9dcd7c1eb193fdce68b7944ce76fa20621d0c0a34b3560c7" Oct 10 06:54:07 crc kubenswrapper[4732]: I1010 06:54:07.523255 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 10 06:54:13 crc kubenswrapper[4732]: I1010 06:54:13.231654 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 06:54:19 crc kubenswrapper[4732]: E1010 06:54:19.549975 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 10 06:54:19 crc kubenswrapper[4732]: E1010 06:54:19.550666 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-874j8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:ni
l,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zzmww_openshift-marketplace(7c8aeb7a-213a-4677-9877-69a57de9d13a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:54:19 crc kubenswrapper[4732]: E1010 06:54:19.552145 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zzmww" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" Oct 10 06:54:20 crc kubenswrapper[4732]: I1010 06:54:20.808858 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 10 06:54:23 crc kubenswrapper[4732]: E1010 06:54:23.968319 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zzmww" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" Oct 10 06:54:24 crc kubenswrapper[4732]: E1010 06:54:24.719328 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 10 06:54:24 crc kubenswrapper[4732]: E1010 06:54:24.719730 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h777j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8ft97_openshift-marketplace(02bfa9a7-ce97-4ec1-a20a-171d59d53f08): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:54:24 crc kubenswrapper[4732]: E1010 06:54:24.721516 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8ft97" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" Oct 10 06:54:25 crc kubenswrapper[4732]: I1010 06:54:25.355742 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:54:25 crc kubenswrapper[4732]: I1010 06:54:25.356305 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:54:25 crc kubenswrapper[4732]: I1010 06:54:25.656896 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-v7gcd" Oct 10 06:54:26 crc kubenswrapper[4732]: E1010 06:54:26.726731 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8ft97" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" Oct 10 06:54:26 crc kubenswrapper[4732]: E1010 06:54:26.892089 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 10 06:54:26 crc kubenswrapper[4732]: E1010 06:54:26.892489 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8rb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hgtsw_openshift-marketplace(51d8b2a9-41c8-43ce-b4d6-accaaea69afb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:54:26 crc kubenswrapper[4732]: E1010 06:54:26.893783 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hgtsw" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" Oct 10 06:54:26 crc kubenswrapper[4732]: I1010 06:54:26.906681 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mj7bk"] Oct 10 06:54:26 crc kubenswrapper[4732]: W1010 06:54:26.915610 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77abff23_1622_4219_a841_49fe8dbb6cc3.slice/crio-c72bcd54989d4ca6da61ada2156bf7ce783d283ac53a85a59ee874c56f420dd2 WatchSource:0}: Error finding container c72bcd54989d4ca6da61ada2156bf7ce783d283ac53a85a59ee874c56f420dd2: Status 404 returned error can't find the container with id c72bcd54989d4ca6da61ada2156bf7ce783d283ac53a85a59ee874c56f420dd2 Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.113532 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.113733 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-brq5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gtdln_openshift-marketplace(9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.114981 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gtdln" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" Oct 10 06:54:27 crc 
kubenswrapper[4732]: E1010 06:54:27.415720 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.416575 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kggmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-z627g_openshift-marketplace(4956619d-c22f-4cd0-983b-70aeb971dde7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.418555 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-z627g" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" Oct 10 06:54:27 crc kubenswrapper[4732]: I1010 06:54:27.650266 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" event={"ID":"77abff23-1622-4219-a841-49fe8dbb6cc3","Type":"ContainerStarted","Data":"66c82d2cd3e1a66fe0d9477af58f90e8355f585f4da5c90f862fabc121830e04"} Oct 10 06:54:27 crc kubenswrapper[4732]: I1010 06:54:27.650326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" event={"ID":"77abff23-1622-4219-a841-49fe8dbb6cc3","Type":"ContainerStarted","Data":"c72bcd54989d4ca6da61ada2156bf7ce783d283ac53a85a59ee874c56f420dd2"} Oct 10 06:54:27 crc kubenswrapper[4732]: I1010 06:54:27.652752 4732 generic.go:334] "Generic (PLEG): container finished" podID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerID="92001f970c5e4f885e7f7390bef915bad061ca2b056b38b53bff3982fbf7e798" exitCode=0 Oct 10 06:54:27 crc kubenswrapper[4732]: I1010 06:54:27.652807 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9cdm" event={"ID":"e249575a-4aa6-40df-ab92-1a72d840a00b","Type":"ContainerDied","Data":"92001f970c5e4f885e7f7390bef915bad061ca2b056b38b53bff3982fbf7e798"} Oct 10 06:54:27 crc kubenswrapper[4732]: I1010 06:54:27.655975 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-cpwlv" event={"ID":"71937edd-921b-491b-96b9-0c48117ae2ce","Type":"ContainerStarted","Data":"a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb"} Oct 10 06:54:27 crc kubenswrapper[4732]: I1010 06:54:27.667242 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5m96q" event={"ID":"5cfde542-d31f-4d68-a738-a2fcdbddfbeb","Type":"ContainerStarted","Data":"eb19c5200f3165ca37a733568b35389d0d5f8acc02bb8b53d2a35594ea302535"} Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.667728 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gtdln" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.669055 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-z627g" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" Oct 10 06:54:27 crc kubenswrapper[4732]: E1010 06:54:27.672603 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hgtsw" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" Oct 10 06:54:28 crc kubenswrapper[4732]: I1010 06:54:28.675869 4732 generic.go:334] "Generic (PLEG): container finished" podID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerID="eb19c5200f3165ca37a733568b35389d0d5f8acc02bb8b53d2a35594ea302535" exitCode=0 Oct 10 06:54:28 crc kubenswrapper[4732]: 
I1010 06:54:28.675951 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5m96q" event={"ID":"5cfde542-d31f-4d68-a738-a2fcdbddfbeb","Type":"ContainerDied","Data":"eb19c5200f3165ca37a733568b35389d0d5f8acc02bb8b53d2a35594ea302535"} Oct 10 06:54:28 crc kubenswrapper[4732]: I1010 06:54:28.685497 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mj7bk" event={"ID":"77abff23-1622-4219-a841-49fe8dbb6cc3","Type":"ContainerStarted","Data":"eecc3d8eb10fe772a25fb749d48ec5c4200f6ed96dfed13ffc168fd4450357ef"} Oct 10 06:54:28 crc kubenswrapper[4732]: I1010 06:54:28.688922 4732 generic.go:334] "Generic (PLEG): container finished" podID="71937edd-921b-491b-96b9-0c48117ae2ce" containerID="a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb" exitCode=0 Oct 10 06:54:28 crc kubenswrapper[4732]: I1010 06:54:28.688973 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpwlv" event={"ID":"71937edd-921b-491b-96b9-0c48117ae2ce","Type":"ContainerDied","Data":"a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb"} Oct 10 06:54:28 crc kubenswrapper[4732]: I1010 06:54:28.728617 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mj7bk" podStartSLOduration=170.728583293 podStartE2EDuration="2m50.728583293s" podCreationTimestamp="2025-10-10 06:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:54:28.724345321 +0000 UTC m=+195.793936592" watchObservedRunningTime="2025-10-10 06:54:28.728583293 +0000 UTC m=+195.798174534" Oct 10 06:54:29 crc kubenswrapper[4732]: I1010 06:54:29.695235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9cdm" 
event={"ID":"e249575a-4aa6-40df-ab92-1a72d840a00b","Type":"ContainerStarted","Data":"a1720bcc89406b2c6865902717f1f15e00cfe3f995bb2075ec6b5a303dfe1b7a"} Oct 10 06:54:30 crc kubenswrapper[4732]: I1010 06:54:30.705350 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpwlv" event={"ID":"71937edd-921b-491b-96b9-0c48117ae2ce","Type":"ContainerStarted","Data":"c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca"} Oct 10 06:54:30 crc kubenswrapper[4732]: I1010 06:54:30.732895 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f9cdm" podStartSLOduration=4.365647761 podStartE2EDuration="38.73287937s" podCreationTimestamp="2025-10-10 06:53:52 +0000 UTC" firstStartedPulling="2025-10-10 06:53:54.247793341 +0000 UTC m=+161.317384582" lastFinishedPulling="2025-10-10 06:54:28.61502496 +0000 UTC m=+195.684616191" observedRunningTime="2025-10-10 06:54:29.717108069 +0000 UTC m=+196.786699330" watchObservedRunningTime="2025-10-10 06:54:30.73287937 +0000 UTC m=+197.802470611" Oct 10 06:54:31 crc kubenswrapper[4732]: I1010 06:54:31.710830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5m96q" event={"ID":"5cfde542-d31f-4d68-a738-a2fcdbddfbeb","Type":"ContainerStarted","Data":"1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0"} Oct 10 06:54:31 crc kubenswrapper[4732]: I1010 06:54:31.728959 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5m96q" podStartSLOduration=3.486281731 podStartE2EDuration="36.72893766s" podCreationTimestamp="2025-10-10 06:53:55 +0000 UTC" firstStartedPulling="2025-10-10 06:53:57.423385194 +0000 UTC m=+164.492976435" lastFinishedPulling="2025-10-10 06:54:30.666041123 +0000 UTC m=+197.735632364" observedRunningTime="2025-10-10 06:54:31.728357034 +0000 UTC m=+198.797948295" 
watchObservedRunningTime="2025-10-10 06:54:31.72893766 +0000 UTC m=+198.798528901" Oct 10 06:54:31 crc kubenswrapper[4732]: I1010 06:54:31.730734 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cpwlv" podStartSLOduration=3.938901457 podStartE2EDuration="36.730726178s" podCreationTimestamp="2025-10-10 06:53:55 +0000 UTC" firstStartedPulling="2025-10-10 06:53:57.419336677 +0000 UTC m=+164.488927918" lastFinishedPulling="2025-10-10 06:54:30.211161398 +0000 UTC m=+197.280752639" observedRunningTime="2025-10-10 06:54:30.733015993 +0000 UTC m=+197.802607254" watchObservedRunningTime="2025-10-10 06:54:31.730726178 +0000 UTC m=+198.800317429" Oct 10 06:54:33 crc kubenswrapper[4732]: I1010 06:54:33.159465 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:54:33 crc kubenswrapper[4732]: I1010 06:54:33.159538 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:54:33 crc kubenswrapper[4732]: I1010 06:54:33.502601 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:54:33 crc kubenswrapper[4732]: I1010 06:54:33.759081 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:54:35 crc kubenswrapper[4732]: I1010 06:54:35.028417 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9cdm"] Oct 10 06:54:35 crc kubenswrapper[4732]: I1010 06:54:35.728792 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f9cdm" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="registry-server" 
containerID="cri-o://a1720bcc89406b2c6865902717f1f15e00cfe3f995bb2075ec6b5a303dfe1b7a" gracePeriod=2 Oct 10 06:54:35 crc kubenswrapper[4732]: I1010 06:54:35.816297 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:54:35 crc kubenswrapper[4732]: I1010 06:54:35.816339 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:54:35 crc kubenswrapper[4732]: I1010 06:54:35.888410 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:54:36 crc kubenswrapper[4732]: I1010 06:54:36.150976 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:54:36 crc kubenswrapper[4732]: I1010 06:54:36.151287 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:54:36 crc kubenswrapper[4732]: I1010 06:54:36.193935 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:54:36 crc kubenswrapper[4732]: I1010 06:54:36.735235 4732 generic.go:334] "Generic (PLEG): container finished" podID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerID="a1720bcc89406b2c6865902717f1f15e00cfe3f995bb2075ec6b5a303dfe1b7a" exitCode=0 Oct 10 06:54:36 crc kubenswrapper[4732]: I1010 06:54:36.735327 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9cdm" event={"ID":"e249575a-4aa6-40df-ab92-1a72d840a00b","Type":"ContainerDied","Data":"a1720bcc89406b2c6865902717f1f15e00cfe3f995bb2075ec6b5a303dfe1b7a"} Oct 10 06:54:36 crc kubenswrapper[4732]: I1010 06:54:36.772277 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 
06:54:36 crc kubenswrapper[4732]: I1010 06:54:36.774588 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.081042 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.207072 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-catalog-content\") pod \"e249575a-4aa6-40df-ab92-1a72d840a00b\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.207130 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-utilities\") pod \"e249575a-4aa6-40df-ab92-1a72d840a00b\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.207190 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84mwh\" (UniqueName: \"kubernetes.io/projected/e249575a-4aa6-40df-ab92-1a72d840a00b-kube-api-access-84mwh\") pod \"e249575a-4aa6-40df-ab92-1a72d840a00b\" (UID: \"e249575a-4aa6-40df-ab92-1a72d840a00b\") " Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.208042 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-utilities" (OuterVolumeSpecName: "utilities") pod "e249575a-4aa6-40df-ab92-1a72d840a00b" (UID: "e249575a-4aa6-40df-ab92-1a72d840a00b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.208259 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.218476 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e249575a-4aa6-40df-ab92-1a72d840a00b-kube-api-access-84mwh" (OuterVolumeSpecName: "kube-api-access-84mwh") pod "e249575a-4aa6-40df-ab92-1a72d840a00b" (UID: "e249575a-4aa6-40df-ab92-1a72d840a00b"). InnerVolumeSpecName "kube-api-access-84mwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.266086 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e249575a-4aa6-40df-ab92-1a72d840a00b" (UID: "e249575a-4aa6-40df-ab92-1a72d840a00b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.309856 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84mwh\" (UniqueName: \"kubernetes.io/projected/e249575a-4aa6-40df-ab92-1a72d840a00b-kube-api-access-84mwh\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.309893 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e249575a-4aa6-40df-ab92-1a72d840a00b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.741751 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f9cdm" event={"ID":"e249575a-4aa6-40df-ab92-1a72d840a00b","Type":"ContainerDied","Data":"b487d345c8393969a42585c8260bd5df0bdf818eea7bf01a3ec30e92b95b17d4"} Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.741828 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f9cdm" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.741835 4732 scope.go:117] "RemoveContainer" containerID="a1720bcc89406b2c6865902717f1f15e00cfe3f995bb2075ec6b5a303dfe1b7a" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.762555 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f9cdm"] Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.765425 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f9cdm"] Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.767674 4732 scope.go:117] "RemoveContainer" containerID="92001f970c5e4f885e7f7390bef915bad061ca2b056b38b53bff3982fbf7e798" Oct 10 06:54:37 crc kubenswrapper[4732]: I1010 06:54:37.784613 4732 scope.go:117] "RemoveContainer" containerID="ea8ea8c500db34dcffb3c38937267888369be05d1ed9e36f5931f72949651d57" Oct 10 06:54:38 crc kubenswrapper[4732]: I1010 06:54:38.227198 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpwlv"] Oct 10 06:54:38 crc kubenswrapper[4732]: I1010 06:54:38.752804 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cpwlv" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" containerName="registry-server" containerID="cri-o://c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca" gracePeriod=2 Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.662750 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.666510 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" path="/var/lib/kubelet/pods/e249575a-4aa6-40df-ab92-1a72d840a00b/volumes" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.740471 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xhr6\" (UniqueName: \"kubernetes.io/projected/71937edd-921b-491b-96b9-0c48117ae2ce-kube-api-access-7xhr6\") pod \"71937edd-921b-491b-96b9-0c48117ae2ce\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.740579 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-catalog-content\") pod \"71937edd-921b-491b-96b9-0c48117ae2ce\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.740847 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-utilities\") pod \"71937edd-921b-491b-96b9-0c48117ae2ce\" (UID: \"71937edd-921b-491b-96b9-0c48117ae2ce\") " Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.743167 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-utilities" (OuterVolumeSpecName: "utilities") pod "71937edd-921b-491b-96b9-0c48117ae2ce" (UID: "71937edd-921b-491b-96b9-0c48117ae2ce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.748410 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71937edd-921b-491b-96b9-0c48117ae2ce-kube-api-access-7xhr6" (OuterVolumeSpecName: "kube-api-access-7xhr6") pod "71937edd-921b-491b-96b9-0c48117ae2ce" (UID: "71937edd-921b-491b-96b9-0c48117ae2ce"). InnerVolumeSpecName "kube-api-access-7xhr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.771010 4732 generic.go:334] "Generic (PLEG): container finished" podID="71937edd-921b-491b-96b9-0c48117ae2ce" containerID="c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca" exitCode=0 Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.771053 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpwlv" event={"ID":"71937edd-921b-491b-96b9-0c48117ae2ce","Type":"ContainerDied","Data":"c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca"} Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.771082 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cpwlv" event={"ID":"71937edd-921b-491b-96b9-0c48117ae2ce","Type":"ContainerDied","Data":"47ce5b734febbf8ae3e317c59bd33dccf78ec0143dfa4d4e5e7d0426ef3a3371"} Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.771104 4732 scope.go:117] "RemoveContainer" containerID="c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.771239 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cpwlv" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.786871 4732 scope.go:117] "RemoveContainer" containerID="a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.802041 4732 scope.go:117] "RemoveContainer" containerID="8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.813704 4732 scope.go:117] "RemoveContainer" containerID="c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca" Oct 10 06:54:39 crc kubenswrapper[4732]: E1010 06:54:39.814024 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca\": container with ID starting with c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca not found: ID does not exist" containerID="c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.814066 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca"} err="failed to get container status \"c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca\": rpc error: code = NotFound desc = could not find container \"c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca\": container with ID starting with c07bd07d5c6a53e0503e16021ef0e8bb1cedc9c6622d194032242aee6c0423ca not found: ID does not exist" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.814128 4732 scope.go:117] "RemoveContainer" containerID="a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb" Oct 10 06:54:39 crc kubenswrapper[4732]: E1010 06:54:39.814412 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb\": container with ID starting with a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb not found: ID does not exist" containerID="a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.814442 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb"} err="failed to get container status \"a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb\": rpc error: code = NotFound desc = could not find container \"a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb\": container with ID starting with a37843ddca2455b5c6e8b0049969341febea77d391242dc435235def997999bb not found: ID does not exist" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.814466 4732 scope.go:117] "RemoveContainer" containerID="8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859" Oct 10 06:54:39 crc kubenswrapper[4732]: E1010 06:54:39.814777 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859\": container with ID starting with 8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859 not found: ID does not exist" containerID="8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.814806 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859"} err="failed to get container status \"8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859\": rpc error: code = NotFound desc = could not find container 
\"8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859\": container with ID starting with 8261a0668a5a84cbc3767803b76046c641faf1ca54a349f6de6004fe44947859 not found: ID does not exist" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.842272 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xhr6\" (UniqueName: \"kubernetes.io/projected/71937edd-921b-491b-96b9-0c48117ae2ce-kube-api-access-7xhr6\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.842315 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.865879 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71937edd-921b-491b-96b9-0c48117ae2ce" (UID: "71937edd-921b-491b-96b9-0c48117ae2ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:54:39 crc kubenswrapper[4732]: I1010 06:54:39.944035 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71937edd-921b-491b-96b9-0c48117ae2ce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:40 crc kubenswrapper[4732]: I1010 06:54:40.164581 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cpwlv"] Oct 10 06:54:40 crc kubenswrapper[4732]: I1010 06:54:40.168162 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cpwlv"] Oct 10 06:54:40 crc kubenswrapper[4732]: I1010 06:54:40.779599 4732 generic.go:334] "Generic (PLEG): container finished" podID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerID="e7c5a18b2a21d3d93e7b6514a2b41e3119159c277d861dd6a035752668759c5f" exitCode=0 Oct 10 06:54:40 crc kubenswrapper[4732]: I1010 06:54:40.780376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzmww" event={"ID":"7c8aeb7a-213a-4677-9877-69a57de9d13a","Type":"ContainerDied","Data":"e7c5a18b2a21d3d93e7b6514a2b41e3119159c277d861dd6a035752668759c5f"} Oct 10 06:54:41 crc kubenswrapper[4732]: I1010 06:54:41.675426 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" path="/var/lib/kubelet/pods/71937edd-921b-491b-96b9-0c48117ae2ce/volumes" Oct 10 06:54:45 crc kubenswrapper[4732]: I1010 06:54:45.809618 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzmww" event={"ID":"7c8aeb7a-213a-4677-9877-69a57de9d13a","Type":"ContainerStarted","Data":"737e5a864b012a74d196b97f75f1f721a0832c726396659f0771984f7a10572b"} Oct 10 06:54:46 crc kubenswrapper[4732]: I1010 06:54:46.815097 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgtsw" 
event={"ID":"51d8b2a9-41c8-43ce-b4d6-accaaea69afb","Type":"ContainerStarted","Data":"3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8"} Oct 10 06:54:46 crc kubenswrapper[4732]: I1010 06:54:46.818404 4732 generic.go:334] "Generic (PLEG): container finished" podID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerID="c2b3b14f3a3afc7079d6f9b87cfda4f80af783de38fd09a02c5b9d4fb2f3b13c" exitCode=0 Oct 10 06:54:46 crc kubenswrapper[4732]: I1010 06:54:46.818470 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z627g" event={"ID":"4956619d-c22f-4cd0-983b-70aeb971dde7","Type":"ContainerDied","Data":"c2b3b14f3a3afc7079d6f9b87cfda4f80af783de38fd09a02c5b9d4fb2f3b13c"} Oct 10 06:54:46 crc kubenswrapper[4732]: I1010 06:54:46.822550 4732 generic.go:334] "Generic (PLEG): container finished" podID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerID="960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b" exitCode=0 Oct 10 06:54:46 crc kubenswrapper[4732]: I1010 06:54:46.822641 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ft97" event={"ID":"02bfa9a7-ce97-4ec1-a20a-171d59d53f08","Type":"ContainerDied","Data":"960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b"} Oct 10 06:54:46 crc kubenswrapper[4732]: I1010 06:54:46.827736 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdln" event={"ID":"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d","Type":"ContainerStarted","Data":"279968cbb05cb93687a6a5943eab5c0883e2301eac01266f6c136bf4ce126fee"} Oct 10 06:54:46 crc kubenswrapper[4732]: I1010 06:54:46.835833 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zzmww" podStartSLOduration=4.131098759 podStartE2EDuration="54.835808682s" podCreationTimestamp="2025-10-10 06:53:52 +0000 UTC" firstStartedPulling="2025-10-10 06:53:54.246153588 +0000 UTC 
m=+161.315744829" lastFinishedPulling="2025-10-10 06:54:44.950863501 +0000 UTC m=+212.020454752" observedRunningTime="2025-10-10 06:54:45.827327857 +0000 UTC m=+212.896919118" watchObservedRunningTime="2025-10-10 06:54:46.835808682 +0000 UTC m=+213.905399923" Oct 10 06:54:47 crc kubenswrapper[4732]: I1010 06:54:47.835077 4732 generic.go:334] "Generic (PLEG): container finished" podID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerID="279968cbb05cb93687a6a5943eab5c0883e2301eac01266f6c136bf4ce126fee" exitCode=0 Oct 10 06:54:47 crc kubenswrapper[4732]: I1010 06:54:47.835166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdln" event={"ID":"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d","Type":"ContainerDied","Data":"279968cbb05cb93687a6a5943eab5c0883e2301eac01266f6c136bf4ce126fee"} Oct 10 06:54:47 crc kubenswrapper[4732]: I1010 06:54:47.839084 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z627g" event={"ID":"4956619d-c22f-4cd0-983b-70aeb971dde7","Type":"ContainerStarted","Data":"4b97172138f666b1f2a31feb9bf5602fd5ccb6c115c31743941905064f7181da"} Oct 10 06:54:47 crc kubenswrapper[4732]: I1010 06:54:47.843584 4732 generic.go:334] "Generic (PLEG): container finished" podID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerID="3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8" exitCode=0 Oct 10 06:54:47 crc kubenswrapper[4732]: I1010 06:54:47.843626 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgtsw" event={"ID":"51d8b2a9-41c8-43ce-b4d6-accaaea69afb","Type":"ContainerDied","Data":"3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8"} Oct 10 06:54:47 crc kubenswrapper[4732]: I1010 06:54:47.903050 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z627g" podStartSLOduration=2.794685312 podStartE2EDuration="53.903027495s" 
podCreationTimestamp="2025-10-10 06:53:54 +0000 UTC" firstStartedPulling="2025-10-10 06:53:56.369576707 +0000 UTC m=+163.439167948" lastFinishedPulling="2025-10-10 06:54:47.47791889 +0000 UTC m=+214.547510131" observedRunningTime="2025-10-10 06:54:47.901444553 +0000 UTC m=+214.971035804" watchObservedRunningTime="2025-10-10 06:54:47.903027495 +0000 UTC m=+214.972618736" Oct 10 06:54:48 crc kubenswrapper[4732]: I1010 06:54:48.849614 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ft97" event={"ID":"02bfa9a7-ce97-4ec1-a20a-171d59d53f08","Type":"ContainerStarted","Data":"602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904"} Oct 10 06:54:48 crc kubenswrapper[4732]: I1010 06:54:48.868956 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8ft97" podStartSLOduration=4.533532787 podStartE2EDuration="54.868939956s" podCreationTimestamp="2025-10-10 06:53:54 +0000 UTC" firstStartedPulling="2025-10-10 06:53:57.411065679 +0000 UTC m=+164.480656920" lastFinishedPulling="2025-10-10 06:54:47.746472848 +0000 UTC m=+214.816064089" observedRunningTime="2025-10-10 06:54:48.866021518 +0000 UTC m=+215.935612779" watchObservedRunningTime="2025-10-10 06:54:48.868939956 +0000 UTC m=+215.938531197" Oct 10 06:54:49 crc kubenswrapper[4732]: I1010 06:54:49.856370 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdln" event={"ID":"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d","Type":"ContainerStarted","Data":"788e2db5f54f91da79e522b6557954ec4c924bc98ba6c71e093c63947d9bf836"} Oct 10 06:54:49 crc kubenswrapper[4732]: I1010 06:54:49.860475 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgtsw" event={"ID":"51d8b2a9-41c8-43ce-b4d6-accaaea69afb","Type":"ContainerStarted","Data":"2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302"} Oct 10 06:54:49 crc 
kubenswrapper[4732]: I1010 06:54:49.892110 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gtdln" podStartSLOduration=1.7321627130000001 podStartE2EDuration="57.892089784s" podCreationTimestamp="2025-10-10 06:53:52 +0000 UTC" firstStartedPulling="2025-10-10 06:53:53.226683607 +0000 UTC m=+160.296274848" lastFinishedPulling="2025-10-10 06:54:49.386610668 +0000 UTC m=+216.456201919" observedRunningTime="2025-10-10 06:54:49.889456163 +0000 UTC m=+216.959047414" watchObservedRunningTime="2025-10-10 06:54:49.892089784 +0000 UTC m=+216.961681035" Oct 10 06:54:49 crc kubenswrapper[4732]: I1010 06:54:49.907495 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hgtsw" podStartSLOduration=3.463709433 podStartE2EDuration="57.907474427s" podCreationTimestamp="2025-10-10 06:53:52 +0000 UTC" firstStartedPulling="2025-10-10 06:53:54.241872895 +0000 UTC m=+161.311464136" lastFinishedPulling="2025-10-10 06:54:48.685637889 +0000 UTC m=+215.755229130" observedRunningTime="2025-10-10 06:54:49.905755161 +0000 UTC m=+216.975346422" watchObservedRunningTime="2025-10-10 06:54:49.907474427 +0000 UTC m=+216.977065668" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.540341 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.540959 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.599834 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.830242 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.830586 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.870397 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.922089 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.958462 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.958799 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:54:52 crc kubenswrapper[4732]: I1010 06:54:52.993943 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:54:53 crc kubenswrapper[4732]: I1010 06:54:53.925248 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:54:54 crc kubenswrapper[4732]: I1010 06:54:54.624119 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hgtsw"] Oct 10 06:54:54 crc kubenswrapper[4732]: I1010 06:54:54.766740 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:54:54 crc kubenswrapper[4732]: I1010 06:54:54.766789 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:54:54 crc kubenswrapper[4732]: I1010 
06:54:54.807803 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:54:54 crc kubenswrapper[4732]: I1010 06:54:54.926726 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.168561 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.168620 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.209851 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.355600 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.355668 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.355741 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.356366 4732 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.356471 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59" gracePeriod=600 Oct 10 06:54:55 crc kubenswrapper[4732]: E1010 06:54:55.438017 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca39c55_1a82_41b2_b7d5_925320a4e8a0.slice/crio-3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59.scope\": RecentStats: unable to find data in memory cache]" Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.893717 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59" exitCode=0 Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.893802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59"} Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.894409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"92d7821c55f776aa0ba046ed761a87a67e7ac50878e724a90c5ab07c8b2d6230"} Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.894454 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hgtsw" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerName="registry-server" containerID="cri-o://2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302" gracePeriod=2 Oct 10 06:54:55 crc kubenswrapper[4732]: I1010 06:54:55.944254 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.265152 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.350390 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-catalog-content\") pod \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.350454 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8rb7\" (UniqueName: \"kubernetes.io/projected/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-kube-api-access-p8rb7\") pod \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.350523 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-utilities\") pod \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\" (UID: \"51d8b2a9-41c8-43ce-b4d6-accaaea69afb\") " Oct 10 06:54:56 crc 
kubenswrapper[4732]: I1010 06:54:56.351388 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-utilities" (OuterVolumeSpecName: "utilities") pod "51d8b2a9-41c8-43ce-b4d6-accaaea69afb" (UID: "51d8b2a9-41c8-43ce-b4d6-accaaea69afb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.355137 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-kube-api-access-p8rb7" (OuterVolumeSpecName: "kube-api-access-p8rb7") pod "51d8b2a9-41c8-43ce-b4d6-accaaea69afb" (UID: "51d8b2a9-41c8-43ce-b4d6-accaaea69afb"). InnerVolumeSpecName "kube-api-access-p8rb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.406485 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51d8b2a9-41c8-43ce-b4d6-accaaea69afb" (UID: "51d8b2a9-41c8-43ce-b4d6-accaaea69afb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.451767 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.451808 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8rb7\" (UniqueName: \"kubernetes.io/projected/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-kube-api-access-p8rb7\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.451824 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51d8b2a9-41c8-43ce-b4d6-accaaea69afb-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.904455 4732 generic.go:334] "Generic (PLEG): container finished" podID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerID="2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302" exitCode=0 Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.904517 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hgtsw" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.904572 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgtsw" event={"ID":"51d8b2a9-41c8-43ce-b4d6-accaaea69afb","Type":"ContainerDied","Data":"2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302"} Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.904656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hgtsw" event={"ID":"51d8b2a9-41c8-43ce-b4d6-accaaea69afb","Type":"ContainerDied","Data":"23d146a92138ecf4793be1966892d1bc19d85be1717ef1858c9bb7134bcbe380"} Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.904677 4732 scope.go:117] "RemoveContainer" containerID="2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.927733 4732 scope.go:117] "RemoveContainer" containerID="3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.933131 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hgtsw"] Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.938844 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hgtsw"] Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.952890 4732 scope.go:117] "RemoveContainer" containerID="3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.970344 4732 scope.go:117] "RemoveContainer" containerID="2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302" Oct 10 06:54:56 crc kubenswrapper[4732]: E1010 06:54:56.970955 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302\": container with ID starting with 2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302 not found: ID does not exist" containerID="2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.970996 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302"} err="failed to get container status \"2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302\": rpc error: code = NotFound desc = could not find container \"2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302\": container with ID starting with 2ada7363998cc3f692ce9a003d98325f7ff5093c65780768181bcde2d2c6e302 not found: ID does not exist" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.971038 4732 scope.go:117] "RemoveContainer" containerID="3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8" Oct 10 06:54:56 crc kubenswrapper[4732]: E1010 06:54:56.971362 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8\": container with ID starting with 3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8 not found: ID does not exist" containerID="3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.971409 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8"} err="failed to get container status \"3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8\": rpc error: code = NotFound desc = could not find container \"3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8\": container with ID 
starting with 3b8b29acdec7acc0a876a4e48613cec2d6a9836397b39f8d6091d801203944b8 not found: ID does not exist" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.971436 4732 scope.go:117] "RemoveContainer" containerID="3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805" Oct 10 06:54:56 crc kubenswrapper[4732]: E1010 06:54:56.971796 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805\": container with ID starting with 3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805 not found: ID does not exist" containerID="3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805" Oct 10 06:54:56 crc kubenswrapper[4732]: I1010 06:54:56.971833 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805"} err="failed to get container status \"3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805\": rpc error: code = NotFound desc = could not find container \"3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805\": container with ID starting with 3b9502eb3490b84b711ebc9c38747ab19f559216ed8e087a62300b3f4136c805 not found: ID does not exist" Oct 10 06:54:57 crc kubenswrapper[4732]: I1010 06:54:57.671507 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" path="/var/lib/kubelet/pods/51d8b2a9-41c8-43ce-b4d6-accaaea69afb/volumes" Oct 10 06:54:58 crc kubenswrapper[4732]: I1010 06:54:58.826625 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ft97"] Oct 10 06:54:58 crc kubenswrapper[4732]: I1010 06:54:58.827208 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8ft97" 
podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="registry-server" containerID="cri-o://602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904" gracePeriod=2 Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.918377 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.919600 4732 generic.go:334] "Generic (PLEG): container finished" podID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerID="602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904" exitCode=0 Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.919645 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ft97" event={"ID":"02bfa9a7-ce97-4ec1-a20a-171d59d53f08","Type":"ContainerDied","Data":"602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904"} Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.919671 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ft97" event={"ID":"02bfa9a7-ce97-4ec1-a20a-171d59d53f08","Type":"ContainerDied","Data":"69e86c390e28e69c4dbf289085ea041306248dfb6cf54b492921c35aa2036934"} Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.919706 4732 scope.go:117] "RemoveContainer" containerID="602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.935285 4732 scope.go:117] "RemoveContainer" containerID="960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.958653 4732 scope.go:117] "RemoveContainer" containerID="2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.971329 4732 scope.go:117] "RemoveContainer" containerID="602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904" Oct 10 
06:54:59 crc kubenswrapper[4732]: E1010 06:54:59.971858 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904\": container with ID starting with 602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904 not found: ID does not exist" containerID="602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.971895 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904"} err="failed to get container status \"602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904\": rpc error: code = NotFound desc = could not find container \"602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904\": container with ID starting with 602ea3f5bf35aa3057c7fdf7d2663488ba820e17de3dca8022cf83cdd4835904 not found: ID does not exist" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.971948 4732 scope.go:117] "RemoveContainer" containerID="960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b" Oct 10 06:54:59 crc kubenswrapper[4732]: E1010 06:54:59.972351 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b\": container with ID starting with 960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b not found: ID does not exist" containerID="960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.972403 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b"} err="failed to get container status 
\"960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b\": rpc error: code = NotFound desc = could not find container \"960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b\": container with ID starting with 960abd66e4464f71d00cb45715f22d06bb54f427a996106a21203cdb17319c4b not found: ID does not exist" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.972435 4732 scope.go:117] "RemoveContainer" containerID="2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec" Oct 10 06:54:59 crc kubenswrapper[4732]: E1010 06:54:59.972833 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec\": container with ID starting with 2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec not found: ID does not exist" containerID="2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.972881 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec"} err="failed to get container status \"2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec\": rpc error: code = NotFound desc = could not find container \"2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec\": container with ID starting with 2920b66b22e2c401413ef298b210910814cd6ccbaa43d5f0c5ab26dbb62dfaec not found: ID does not exist" Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.994352 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-catalog-content\") pod \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.994434 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h777j\" (UniqueName: \"kubernetes.io/projected/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-kube-api-access-h777j\") pod \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.994507 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-utilities\") pod \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\" (UID: \"02bfa9a7-ce97-4ec1-a20a-171d59d53f08\") " Oct 10 06:54:59 crc kubenswrapper[4732]: I1010 06:54:59.995438 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-utilities" (OuterVolumeSpecName: "utilities") pod "02bfa9a7-ce97-4ec1-a20a-171d59d53f08" (UID: "02bfa9a7-ce97-4ec1-a20a-171d59d53f08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.000118 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-kube-api-access-h777j" (OuterVolumeSpecName: "kube-api-access-h777j") pod "02bfa9a7-ce97-4ec1-a20a-171d59d53f08" (UID: "02bfa9a7-ce97-4ec1-a20a-171d59d53f08"). InnerVolumeSpecName "kube-api-access-h777j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.011317 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02bfa9a7-ce97-4ec1-a20a-171d59d53f08" (UID: "02bfa9a7-ce97-4ec1-a20a-171d59d53f08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.096019 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h777j\" (UniqueName: \"kubernetes.io/projected/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-kube-api-access-h777j\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.096060 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.096072 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02bfa9a7-ce97-4ec1-a20a-171d59d53f08-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.925319 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ft97" Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.949945 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ft97"] Oct 10 06:55:00 crc kubenswrapper[4732]: I1010 06:55:00.957389 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ft97"] Oct 10 06:55:01 crc kubenswrapper[4732]: I1010 06:55:01.667835 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" path="/var/lib/kubelet/pods/02bfa9a7-ce97-4ec1-a20a-171d59d53f08/volumes" Oct 10 06:55:02 crc kubenswrapper[4732]: I1010 06:55:02.577285 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:55:04 crc kubenswrapper[4732]: I1010 06:55:04.493147 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-8qd8h"] Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.518159 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" podUID="524083e6-c56c-4c74-b700-ac668cb2022c" containerName="oauth-openshift" containerID="cri-o://a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d" gracePeriod=15 Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.834186 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.874865 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-86d854dc6b-4nkww"] Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875057 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875069 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875078 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="786942b2-53ca-4df8-8d52-0324504dbd3c" containerName="pruner" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875084 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="786942b2-53ca-4df8-8d52-0324504dbd3c" containerName="pruner" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875093 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875099 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" 
containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875107 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875112 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875118 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875124 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875131 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875137 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875144 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875150 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875158 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875165 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" 
containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875175 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875181 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875189 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f9024d-5fe4-4c0d-9fe4-7544a948f085" containerName="pruner" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875195 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f9024d-5fe4-4c0d-9fe4-7544a948f085" containerName="pruner" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875204 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875210 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875217 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875222 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="extract-content" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875229 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875235 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" 
containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875245 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="524083e6-c56c-4c74-b700-ac668cb2022c" containerName="oauth-openshift" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875252 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="524083e6-c56c-4c74-b700-ac668cb2022c" containerName="oauth-openshift" Oct 10 06:55:29 crc kubenswrapper[4732]: E1010 06:55:29.875260 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875266 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="extract-utilities" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875373 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="786942b2-53ca-4df8-8d52-0324504dbd3c" containerName="pruner" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875384 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="02bfa9a7-ce97-4ec1-a20a-171d59d53f08" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875391 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f9024d-5fe4-4c0d-9fe4-7544a948f085" containerName="pruner" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875398 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="71937edd-921b-491b-96b9-0c48117ae2ce" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875407 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e249575a-4aa6-40df-ab92-1a72d840a00b" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875415 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="524083e6-c56c-4c74-b700-ac668cb2022c" 
containerName="oauth-openshift" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875423 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d8b2a9-41c8-43ce-b4d6-accaaea69afb" containerName="registry-server" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.875772 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.888542 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86d854dc6b-4nkww"] Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970142 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-ocp-branding-template\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970198 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-error\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970241 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-login\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970263 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-cliconfig\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970303 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-service-ca\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970337 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-provider-selection\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970363 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-serving-cert\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970389 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvgsj\" (UniqueName: \"kubernetes.io/projected/524083e6-c56c-4c74-b700-ac668cb2022c-kube-api-access-wvgsj\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970418 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/524083e6-c56c-4c74-b700-ac668cb2022c-audit-dir\") pod 
\"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-idp-0-file-data\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970487 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-audit-policies\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-router-certs\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970583 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-trusted-ca-bundle\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.970607 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-session\") pod \"524083e6-c56c-4c74-b700-ac668cb2022c\" (UID: \"524083e6-c56c-4c74-b700-ac668cb2022c\") " Oct 10 
06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.971380 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/524083e6-c56c-4c74-b700-ac668cb2022c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.971761 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.972338 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.973420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.973582 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.979146 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.979876 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/524083e6-c56c-4c74-b700-ac668cb2022c-kube-api-access-wvgsj" (OuterVolumeSpecName: "kube-api-access-wvgsj") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "kube-api-access-wvgsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.983625 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.984539 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.992270 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:29 crc kubenswrapper[4732]: I1010 06:55:29.994155 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.003257 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.006101 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.006351 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "524083e6-c56c-4c74-b700-ac668cb2022c" (UID: "524083e6-c56c-4c74-b700-ac668cb2022c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.065498 4732 generic.go:334] "Generic (PLEG): container finished" podID="524083e6-c56c-4c74-b700-ac668cb2022c" containerID="a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d" exitCode=0 Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.065549 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" event={"ID":"524083e6-c56c-4c74-b700-ac668cb2022c","Type":"ContainerDied","Data":"a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d"} Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.065577 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" event={"ID":"524083e6-c56c-4c74-b700-ac668cb2022c","Type":"ContainerDied","Data":"bf80d81a548701aee45554b4063f73093161747322aa85ae4c118c42457d69e0"} Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.065604 4732 
scope.go:117] "RemoveContainer" containerID="a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.065667 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8qd8h" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074341 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-service-ca\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dfc5b382-2828-4217-8a5c-2d2b06de8b12-audit-dir\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074400 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074419 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-audit-policies\") pod \"oauth-openshift-86d854dc6b-4nkww\" 
(UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074437 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074470 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074488 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-login\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074507 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-router-certs\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: 
I1010 06:55:30.074524 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074540 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074559 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-error\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074577 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-session\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074594 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nxkr4\" (UniqueName: \"kubernetes.io/projected/dfc5b382-2828-4217-8a5c-2d2b06de8b12-kube-api-access-nxkr4\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074614 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074940 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074976 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074986 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.074996 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 
crc kubenswrapper[4732]: I1010 06:55:30.075005 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075015 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075025 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075033 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvgsj\" (UniqueName: \"kubernetes.io/projected/524083e6-c56c-4c74-b700-ac668cb2022c-kube-api-access-wvgsj\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075044 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/524083e6-c56c-4c74-b700-ac668cb2022c-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075053 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075062 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-audit-policies\") on node \"crc\" 
DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075070 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075079 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.075088 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/524083e6-c56c-4c74-b700-ac668cb2022c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.087980 4732 scope.go:117] "RemoveContainer" containerID="a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d" Oct 10 06:55:30 crc kubenswrapper[4732]: E1010 06:55:30.088545 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d\": container with ID starting with a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d not found: ID does not exist" containerID="a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.088582 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d"} err="failed to get container status \"a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d\": rpc error: code = NotFound desc = could not find container 
\"a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d\": container with ID starting with a12fa1c7b5a1fc0e49727b0925f58c8470b9ae2c82274db35d30d7f02c0e4c8d not found: ID does not exist" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.114491 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8qd8h"] Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.117018 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8qd8h"] Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175861 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-login\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175886 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-router-certs\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175903 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175923 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-error\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175966 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-session\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.175985 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxkr4\" (UniqueName: \"kubernetes.io/projected/dfc5b382-2828-4217-8a5c-2d2b06de8b12-kube-api-access-nxkr4\") pod \"oauth-openshift-86d854dc6b-4nkww\" 
(UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176004 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176031 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-service-ca\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176056 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dfc5b382-2828-4217-8a5c-2d2b06de8b12-audit-dir\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176086 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176106 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-audit-policies\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176125 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176340 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dfc5b382-2828-4217-8a5c-2d2b06de8b12-audit-dir\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.176782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-cliconfig\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.177582 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-audit-policies\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 
06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.177590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-service-ca\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.177626 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.181352 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.181375 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.181441 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-login\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.181857 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-session\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.182227 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-user-template-error\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.182595 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-serving-cert\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.183443 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" 
Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.183963 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dfc5b382-2828-4217-8a5c-2d2b06de8b12-v4-0-config-system-router-certs\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.197508 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxkr4\" (UniqueName: \"kubernetes.io/projected/dfc5b382-2828-4217-8a5c-2d2b06de8b12-kube-api-access-nxkr4\") pod \"oauth-openshift-86d854dc6b-4nkww\" (UID: \"dfc5b382-2828-4217-8a5c-2d2b06de8b12\") " pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.202409 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:30 crc kubenswrapper[4732]: I1010 06:55:30.374505 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-86d854dc6b-4nkww"] Oct 10 06:55:31 crc kubenswrapper[4732]: I1010 06:55:31.080772 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" event={"ID":"dfc5b382-2828-4217-8a5c-2d2b06de8b12","Type":"ContainerStarted","Data":"c7f63943475f286e740ee641f060fe624a34fc5e4922e89366b7c5a96c7deff9"} Oct 10 06:55:31 crc kubenswrapper[4732]: I1010 06:55:31.080892 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:31 crc kubenswrapper[4732]: I1010 06:55:31.080917 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" 
event={"ID":"dfc5b382-2828-4217-8a5c-2d2b06de8b12","Type":"ContainerStarted","Data":"f36e8f19de0a2c814003abf633c2fffe25f9ae8401731b98f127bcdea55c6db0"} Oct 10 06:55:31 crc kubenswrapper[4732]: I1010 06:55:31.088377 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" Oct 10 06:55:31 crc kubenswrapper[4732]: I1010 06:55:31.129928 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-86d854dc6b-4nkww" podStartSLOduration=27.129906234 podStartE2EDuration="27.129906234s" podCreationTimestamp="2025-10-10 06:55:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:55:31.10817496 +0000 UTC m=+258.177766221" watchObservedRunningTime="2025-10-10 06:55:31.129906234 +0000 UTC m=+258.199497475" Oct 10 06:55:31 crc kubenswrapper[4732]: I1010 06:55:31.669150 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524083e6-c56c-4c74-b700-ac668cb2022c" path="/var/lib/kubelet/pods/524083e6-c56c-4c74-b700-ac668cb2022c/volumes" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.670378 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtdln"] Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.671074 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gtdln" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="registry-server" containerID="cri-o://788e2db5f54f91da79e522b6557954ec4c924bc98ba6c71e093c63947d9bf836" gracePeriod=30 Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.691615 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzmww"] Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.691915 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zzmww" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="registry-server" containerID="cri-o://737e5a864b012a74d196b97f75f1f721a0832c726396659f0771984f7a10572b" gracePeriod=30 Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.713048 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g2xxs"] Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.713414 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" podUID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" containerName="marketplace-operator" containerID="cri-o://4164ebb704d4c8bc353411a720e1d685a7e31d63d2f35d3337d14ca5170fa6ec" gracePeriod=30 Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.727926 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-54zqg"] Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.729908 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.735314 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z627g"] Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.735548 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z627g" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="registry-server" containerID="cri-o://4b97172138f666b1f2a31feb9bf5602fd5ccb6c115c31743941905064f7181da" gracePeriod=30 Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.740438 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5m96q"] Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.740665 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5m96q" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="registry-server" containerID="cri-o://1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0" gracePeriod=30 Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.745968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-54zqg"] Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.780430 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-kube-api-access-fpt2n\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.780485 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.780542 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: E1010 06:55:45.816667 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0 is running failed: container process not found" containerID="1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0" cmd=["grpc_health_probe","-addr=:50051"] Oct 10 06:55:45 crc kubenswrapper[4732]: E1010 06:55:45.817392 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0 is running failed: container process not found" containerID="1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0" cmd=["grpc_health_probe","-addr=:50051"] Oct 10 06:55:45 crc kubenswrapper[4732]: E1010 06:55:45.818255 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0 is running failed: container process not found" 
containerID="1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0" cmd=["grpc_health_probe","-addr=:50051"] Oct 10 06:55:45 crc kubenswrapper[4732]: E1010 06:55:45.818289 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-5m96q" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="registry-server" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.881601 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.881687 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-kube-api-access-fpt2n\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.881741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.883347 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.891018 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:45 crc kubenswrapper[4732]: I1010 06:55:45.898234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpt2n\" (UniqueName: \"kubernetes.io/projected/40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f-kube-api-access-fpt2n\") pod \"marketplace-operator-79b997595-54zqg\" (UID: \"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f\") " pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:46 crc kubenswrapper[4732]: E1010 06:55:46.058078 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebb0d91_a16e_4af3_ac63_8c1142e6bfac.slice/crio-4164ebb704d4c8bc353411a720e1d685a7e31d63d2f35d3337d14ca5170fa6ec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebb0d91_a16e_4af3_ac63_8c1142e6bfac.slice/crio-conmon-4164ebb704d4c8bc353411a720e1d685a7e31d63d2f35d3337d14ca5170fa6ec.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c8aeb7a_213a_4677_9877_69a57de9d13a.slice/crio-conmon-737e5a864b012a74d196b97f75f1f721a0832c726396659f0771984f7a10572b.scope\": RecentStats: unable to find data in memory cache]" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.166055 4732 generic.go:334] "Generic (PLEG): container finished" podID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" containerID="4164ebb704d4c8bc353411a720e1d685a7e31d63d2f35d3337d14ca5170fa6ec" exitCode=0 Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.166752 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" event={"ID":"bebb0d91-a16e-4af3-ac63-8c1142e6bfac","Type":"ContainerDied","Data":"4164ebb704d4c8bc353411a720e1d685a7e31d63d2f35d3337d14ca5170fa6ec"} Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.172681 4732 generic.go:334] "Generic (PLEG): container finished" podID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerID="788e2db5f54f91da79e522b6557954ec4c924bc98ba6c71e093c63947d9bf836" exitCode=0 Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.172770 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdln" event={"ID":"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d","Type":"ContainerDied","Data":"788e2db5f54f91da79e522b6557954ec4c924bc98ba6c71e093c63947d9bf836"} Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.172803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtdln" event={"ID":"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d","Type":"ContainerDied","Data":"405e10c6baf4dca96285a5406764bee0a2fb9810d55da2b25d57f90c84e49d8f"} Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.172815 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405e10c6baf4dca96285a5406764bee0a2fb9810d55da2b25d57f90c84e49d8f" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 
06:55:46.179254 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z627g" event={"ID":"4956619d-c22f-4cd0-983b-70aeb971dde7","Type":"ContainerDied","Data":"4b97172138f666b1f2a31feb9bf5602fd5ccb6c115c31743941905064f7181da"} Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.179298 4732 generic.go:334] "Generic (PLEG): container finished" podID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerID="4b97172138f666b1f2a31feb9bf5602fd5ccb6c115c31743941905064f7181da" exitCode=0 Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.184790 4732 generic.go:334] "Generic (PLEG): container finished" podID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerID="1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0" exitCode=0 Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.184849 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5m96q" event={"ID":"5cfde542-d31f-4d68-a738-a2fcdbddfbeb","Type":"ContainerDied","Data":"1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0"} Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.189333 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.193445 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.193747 4732 generic.go:334] "Generic (PLEG): container finished" podID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerID="737e5a864b012a74d196b97f75f1f721a0832c726396659f0771984f7a10572b" exitCode=0 Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.193905 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzmww" event={"ID":"7c8aeb7a-213a-4677-9877-69a57de9d13a","Type":"ContainerDied","Data":"737e5a864b012a74d196b97f75f1f721a0832c726396659f0771984f7a10572b"} Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.194130 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zzmww" event={"ID":"7c8aeb7a-213a-4677-9877-69a57de9d13a","Type":"ContainerDied","Data":"c9b0b24c54bafcc307e4e41fab91c4e84a0292ca5e9ad8631d5d4f3e32fb9236"} Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.194255 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b0b24c54bafcc307e4e41fab91c4e84a0292ca5e9ad8631d5d4f3e32fb9236" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.199787 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.237908 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.250832 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.258607 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.292049 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c867t\" (UniqueName: \"kubernetes.io/projected/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-kube-api-access-c867t\") pod \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.293845 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brq5m\" (UniqueName: \"kubernetes.io/projected/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-kube-api-access-brq5m\") pod \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.293989 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-utilities\") pod \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.294597 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-catalog-content\") pod \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.294786 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-catalog-content\") pod \"4956619d-c22f-4cd0-983b-70aeb971dde7\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.294925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-catalog-content\") pod \"7c8aeb7a-213a-4677-9877-69a57de9d13a\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.295590 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-874j8\" (UniqueName: \"kubernetes.io/projected/7c8aeb7a-213a-4677-9877-69a57de9d13a-kube-api-access-874j8\") pod \"7c8aeb7a-213a-4677-9877-69a57de9d13a\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.295938 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-utilities" (OuterVolumeSpecName: "utilities") pod "5cfde542-d31f-4d68-a738-a2fcdbddfbeb" (UID: "5cfde542-d31f-4d68-a738-a2fcdbddfbeb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.296219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-operator-metrics\") pod \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.296340 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-catalog-content\") pod \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.301846 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-utilities\") pod \"4956619d-c22f-4cd0-983b-70aeb971dde7\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.302107 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5pgx\" (UniqueName: \"kubernetes.io/projected/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-kube-api-access-h5pgx\") pod \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\" (UID: \"5cfde542-d31f-4d68-a738-a2fcdbddfbeb\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.303976 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-utilities\") pod \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\" (UID: \"9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.304100 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-trusted-ca\") pod \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\" (UID: \"bebb0d91-a16e-4af3-ac63-8c1142e6bfac\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.304175 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-utilities\") pod \"7c8aeb7a-213a-4677-9877-69a57de9d13a\" (UID: \"7c8aeb7a-213a-4677-9877-69a57de9d13a\") " Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.304249 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kggmn\" (UniqueName: \"kubernetes.io/projected/4956619d-c22f-4cd0-983b-70aeb971dde7-kube-api-access-kggmn\") pod \"4956619d-c22f-4cd0-983b-70aeb971dde7\" (UID: \"4956619d-c22f-4cd0-983b-70aeb971dde7\") " Oct 10 06:55:46 crc 
kubenswrapper[4732]: I1010 06:55:46.304682 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.299310 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-kube-api-access-c867t" (OuterVolumeSpecName: "kube-api-access-c867t") pod "bebb0d91-a16e-4af3-ac63-8c1142e6bfac" (UID: "bebb0d91-a16e-4af3-ac63-8c1142e6bfac"). InnerVolumeSpecName "kube-api-access-c867t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.301653 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-kube-api-access-brq5m" (OuterVolumeSpecName: "kube-api-access-brq5m") pod "9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" (UID: "9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d"). InnerVolumeSpecName "kube-api-access-brq5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.304254 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-utilities" (OuterVolumeSpecName: "utilities") pod "4956619d-c22f-4cd0-983b-70aeb971dde7" (UID: "4956619d-c22f-4cd0-983b-70aeb971dde7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.305036 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-utilities" (OuterVolumeSpecName: "utilities") pod "9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" (UID: "9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.305654 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "bebb0d91-a16e-4af3-ac63-8c1142e6bfac" (UID: "bebb0d91-a16e-4af3-ac63-8c1142e6bfac"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.305758 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-kube-api-access-h5pgx" (OuterVolumeSpecName: "kube-api-access-h5pgx") pod "5cfde542-d31f-4d68-a738-a2fcdbddfbeb" (UID: "5cfde542-d31f-4d68-a738-a2fcdbddfbeb"). InnerVolumeSpecName "kube-api-access-h5pgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.305760 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-utilities" (OuterVolumeSpecName: "utilities") pod "7c8aeb7a-213a-4677-9877-69a57de9d13a" (UID: "7c8aeb7a-213a-4677-9877-69a57de9d13a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.307563 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c8aeb7a-213a-4677-9877-69a57de9d13a-kube-api-access-874j8" (OuterVolumeSpecName: "kube-api-access-874j8") pod "7c8aeb7a-213a-4677-9877-69a57de9d13a" (UID: "7c8aeb7a-213a-4677-9877-69a57de9d13a"). InnerVolumeSpecName "kube-api-access-874j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.308045 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "bebb0d91-a16e-4af3-ac63-8c1142e6bfac" (UID: "bebb0d91-a16e-4af3-ac63-8c1142e6bfac"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.314093 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4956619d-c22f-4cd0-983b-70aeb971dde7-kube-api-access-kggmn" (OuterVolumeSpecName: "kube-api-access-kggmn") pod "4956619d-c22f-4cd0-983b-70aeb971dde7" (UID: "4956619d-c22f-4cd0-983b-70aeb971dde7"). InnerVolumeSpecName "kube-api-access-kggmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.357065 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4956619d-c22f-4cd0-983b-70aeb971dde7" (UID: "4956619d-c22f-4cd0-983b-70aeb971dde7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.372812 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c8aeb7a-213a-4677-9877-69a57de9d13a" (UID: "7c8aeb7a-213a-4677-9877-69a57de9d13a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.372849 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" (UID: "9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405677 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kggmn\" (UniqueName: \"kubernetes.io/projected/4956619d-c22f-4cd0-983b-70aeb971dde7-kube-api-access-kggmn\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405741 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c867t\" (UniqueName: \"kubernetes.io/projected/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-kube-api-access-c867t\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405751 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brq5m\" (UniqueName: \"kubernetes.io/projected/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-kube-api-access-brq5m\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405763 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405774 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405786 4732 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-874j8\" (UniqueName: \"kubernetes.io/projected/7c8aeb7a-213a-4677-9877-69a57de9d13a-kube-api-access-874j8\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405798 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405809 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405820 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4956619d-c22f-4cd0-983b-70aeb971dde7-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405832 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5pgx\" (UniqueName: \"kubernetes.io/projected/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-kube-api-access-h5pgx\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405843 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405853 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bebb0d91-a16e-4af3-ac63-8c1142e6bfac-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.405864 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7c8aeb7a-213a-4677-9877-69a57de9d13a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.414606 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cfde542-d31f-4d68-a738-a2fcdbddfbeb" (UID: "5cfde542-d31f-4d68-a738-a2fcdbddfbeb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.491306 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-54zqg"] Oct 10 06:55:46 crc kubenswrapper[4732]: I1010 06:55:46.506502 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cfde542-d31f-4d68-a738-a2fcdbddfbeb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.200949 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z627g" event={"ID":"4956619d-c22f-4cd0-983b-70aeb971dde7","Type":"ContainerDied","Data":"6016a20a483bcac9d5efbced4d01d4dda6099b8619344182a5ffc21c5e280c48"} Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.200993 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z627g" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.201268 4732 scope.go:117] "RemoveContainer" containerID="4b97172138f666b1f2a31feb9bf5602fd5ccb6c115c31743941905064f7181da" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.207578 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5m96q" event={"ID":"5cfde542-d31f-4d68-a738-a2fcdbddfbeb","Type":"ContainerDied","Data":"f75a67fe0c7039082d609e6f95c6e843655d327e0a288bb188733c0dc5b0f841"} Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.207588 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5m96q" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.209974 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.210013 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g2xxs" event={"ID":"bebb0d91-a16e-4af3-ac63-8c1142e6bfac","Type":"ContainerDied","Data":"a6fc517eca334609b04441c4f692e12c4bffa2243e48bda4741d25eb013889dc"} Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.212899 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gtdln" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.213823 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" event={"ID":"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f","Type":"ContainerStarted","Data":"65db8996a05e166d87d5ca44ca6026b8e3d53e3c200755a6b610f16b4b509276"} Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.213876 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" event={"ID":"40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f","Type":"ContainerStarted","Data":"689e160444a617a8139205bbde41ede4ed2f3203a2a50fd84c8d280e370be589"} Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.215065 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.215383 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zzmww" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.218134 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.222063 4732 scope.go:117] "RemoveContainer" containerID="c2b3b14f3a3afc7079d6f9b87cfda4f80af783de38fd09a02c5b9d4fb2f3b13c" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.237731 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-54zqg" podStartSLOduration=2.237711934 podStartE2EDuration="2.237711934s" podCreationTimestamp="2025-10-10 06:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:55:47.235341637 +0000 UTC m=+274.304932908" watchObservedRunningTime="2025-10-10 06:55:47.237711934 +0000 UTC m=+274.307303185" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.239783 4732 scope.go:117] "RemoveContainer" containerID="9417bea2d553de16dcf61e2919f47961933374504f250c92b0a9cbe2a27930cc" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.260931 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z627g"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.269516 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z627g"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.272425 4732 scope.go:117] "RemoveContainer" containerID="1a191204ff6410d5ce91bb5ceda76916bfc9d82fd01b98dd3ea2f6ab2f5348c0" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.296568 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5m96q"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.300184 4732 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5m96q"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.301225 4732 scope.go:117] "RemoveContainer" containerID="eb19c5200f3165ca37a733568b35389d0d5f8acc02bb8b53d2a35594ea302535" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.308997 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zzmww"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.312350 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zzmww"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.323331 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g2xxs"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.326618 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g2xxs"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.332079 4732 scope.go:117] "RemoveContainer" containerID="6e46140a2b248830b89cf579bfc53a3861a5004f4ab4b9274b2f8eedce3c61af" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.337260 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtdln"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.346809 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gtdln"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.353678 4732 scope.go:117] "RemoveContainer" containerID="4164ebb704d4c8bc353411a720e1d685a7e31d63d2f35d3337d14ca5170fa6ec" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.670462 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" path="/var/lib/kubelet/pods/4956619d-c22f-4cd0-983b-70aeb971dde7/volumes" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 
06:55:47.672745 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" path="/var/lib/kubelet/pods/5cfde542-d31f-4d68-a738-a2fcdbddfbeb/volumes" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.674593 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" path="/var/lib/kubelet/pods/7c8aeb7a-213a-4677-9877-69a57de9d13a/volumes" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.676154 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" path="/var/lib/kubelet/pods/9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d/volumes" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.676941 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" path="/var/lib/kubelet/pods/bebb0d91-a16e-4af3-ac63-8c1142e6bfac/volumes" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.896624 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nxp2x"] Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.896980 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.896999 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897096 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897131 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897749 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897767 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897784 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897792 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897804 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897811 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897819 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897827 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897837 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897845 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897855 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897862 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897877 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897885 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897897 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897904 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897916 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897923 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="extract-content" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897934 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" containerName="marketplace-operator" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897941 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" containerName="marketplace-operator" Oct 10 06:55:47 crc kubenswrapper[4732]: E1010 06:55:47.897950 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.897957 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="extract-utilities" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.898068 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4956619d-c22f-4cd0-983b-70aeb971dde7" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.898079 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c8aeb7a-213a-4677-9877-69a57de9d13a" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.898090 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9613d7c5-01ef-4d3d-8ebe-c2dab4e83b4d" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.898098 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bebb0d91-a16e-4af3-ac63-8c1142e6bfac" containerName="marketplace-operator" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.898106 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfde542-d31f-4d68-a738-a2fcdbddfbeb" containerName="registry-server" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.903099 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nxp2x"] Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.903219 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:47 crc kubenswrapper[4732]: I1010 06:55:47.905781 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.038349 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkgq\" (UniqueName: \"kubernetes.io/projected/892e4d19-13af-44ac-ad0c-709b2200a088-kube-api-access-5pkgq\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.038423 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892e4d19-13af-44ac-ad0c-709b2200a088-catalog-content\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.038466 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/892e4d19-13af-44ac-ad0c-709b2200a088-utilities\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.095467 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p868x"] Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.098775 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.103337 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.114884 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p868x"] Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.139818 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9853e548-8347-4b73-b09b-4315d6956d6a-catalog-content\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.139895 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkgq\" (UniqueName: \"kubernetes.io/projected/892e4d19-13af-44ac-ad0c-709b2200a088-kube-api-access-5pkgq\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.139929 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892e4d19-13af-44ac-ad0c-709b2200a088-catalog-content\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.139953 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db59w\" (UniqueName: \"kubernetes.io/projected/9853e548-8347-4b73-b09b-4315d6956d6a-kube-api-access-db59w\") pod \"redhat-marketplace-p868x\" (UID: 
\"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.139974 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/892e4d19-13af-44ac-ad0c-709b2200a088-utilities\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.139992 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9853e548-8347-4b73-b09b-4315d6956d6a-utilities\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.140430 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/892e4d19-13af-44ac-ad0c-709b2200a088-catalog-content\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.140648 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/892e4d19-13af-44ac-ad0c-709b2200a088-utilities\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.170213 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkgq\" (UniqueName: \"kubernetes.io/projected/892e4d19-13af-44ac-ad0c-709b2200a088-kube-api-access-5pkgq\") pod \"certified-operators-nxp2x\" (UID: \"892e4d19-13af-44ac-ad0c-709b2200a088\") " 
pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.215914 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nxp2x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.241851 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db59w\" (UniqueName: \"kubernetes.io/projected/9853e548-8347-4b73-b09b-4315d6956d6a-kube-api-access-db59w\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.243502 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9853e548-8347-4b73-b09b-4315d6956d6a-utilities\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.243602 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9853e548-8347-4b73-b09b-4315d6956d6a-catalog-content\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.244281 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9853e548-8347-4b73-b09b-4315d6956d6a-utilities\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.244350 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9853e548-8347-4b73-b09b-4315d6956d6a-catalog-content\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.262480 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db59w\" (UniqueName: \"kubernetes.io/projected/9853e548-8347-4b73-b09b-4315d6956d6a-kube-api-access-db59w\") pod \"redhat-marketplace-p868x\" (UID: \"9853e548-8347-4b73-b09b-4315d6956d6a\") " pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.413585 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p868x" Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.605235 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nxp2x"] Oct 10 06:55:48 crc kubenswrapper[4732]: W1010 06:55:48.612922 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892e4d19_13af_44ac_ad0c_709b2200a088.slice/crio-cd50c2c9b1110acbf5d1fc0a860a671e726c83c928a3b2249f95a94c78da7034 WatchSource:0}: Error finding container cd50c2c9b1110acbf5d1fc0a860a671e726c83c928a3b2249f95a94c78da7034: Status 404 returned error can't find the container with id cd50c2c9b1110acbf5d1fc0a860a671e726c83c928a3b2249f95a94c78da7034 Oct 10 06:55:48 crc kubenswrapper[4732]: I1010 06:55:48.789596 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p868x"] Oct 10 06:55:49 crc kubenswrapper[4732]: I1010 06:55:49.226075 4732 generic.go:334] "Generic (PLEG): container finished" podID="892e4d19-13af-44ac-ad0c-709b2200a088" containerID="ffee8b6e3ba2ca5f4baddb79a94e62d8e745127d52bf976f7f996bbf071a50d1" exitCode=0 Oct 10 06:55:49 crc kubenswrapper[4732]: I1010 
06:55:49.226136 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxp2x" event={"ID":"892e4d19-13af-44ac-ad0c-709b2200a088","Type":"ContainerDied","Data":"ffee8b6e3ba2ca5f4baddb79a94e62d8e745127d52bf976f7f996bbf071a50d1"} Oct 10 06:55:49 crc kubenswrapper[4732]: I1010 06:55:49.226190 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxp2x" event={"ID":"892e4d19-13af-44ac-ad0c-709b2200a088","Type":"ContainerStarted","Data":"cd50c2c9b1110acbf5d1fc0a860a671e726c83c928a3b2249f95a94c78da7034"} Oct 10 06:55:49 crc kubenswrapper[4732]: I1010 06:55:49.228787 4732 generic.go:334] "Generic (PLEG): container finished" podID="9853e548-8347-4b73-b09b-4315d6956d6a" containerID="e72ad07e34c3d0358279bb22657dd01fa7cb729487924ea75aa5d29445d027b1" exitCode=0 Oct 10 06:55:49 crc kubenswrapper[4732]: I1010 06:55:49.228912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p868x" event={"ID":"9853e548-8347-4b73-b09b-4315d6956d6a","Type":"ContainerDied","Data":"e72ad07e34c3d0358279bb22657dd01fa7cb729487924ea75aa5d29445d027b1"} Oct 10 06:55:49 crc kubenswrapper[4732]: I1010 06:55:49.228963 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p868x" event={"ID":"9853e548-8347-4b73-b09b-4315d6956d6a","Type":"ContainerStarted","Data":"067db6d45b8f6242fd6afdc0b05c8ad01e332c7f57ce024e79c47acac2e11f6a"} Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.236634 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxp2x" event={"ID":"892e4d19-13af-44ac-ad0c-709b2200a088","Type":"ContainerStarted","Data":"87448fcfb66f92407e4fd5bdc10df16cbad78a7cb57783eb7b54cea373588f34"} Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.241822 4732 generic.go:334] "Generic (PLEG): container finished" podID="9853e548-8347-4b73-b09b-4315d6956d6a" 
containerID="f1889faac2538405d522d3a91c5f801d0d468ba21adad077324ed94e826b4af1" exitCode=0 Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.241869 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p868x" event={"ID":"9853e548-8347-4b73-b09b-4315d6956d6a","Type":"ContainerDied","Data":"f1889faac2538405d522d3a91c5f801d0d468ba21adad077324ed94e826b4af1"} Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.302606 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6prtm"] Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.303775 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.309080 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.322801 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6prtm"] Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.374553 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b4b052-f1f8-45d4-b837-3acca46f7e39-utilities\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.374657 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b4b052-f1f8-45d4-b837-3acca46f7e39-catalog-content\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.375672 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpnl\" (UniqueName: \"kubernetes.io/projected/e1b4b052-f1f8-45d4-b837-3acca46f7e39-kube-api-access-4tpnl\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.478216 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b4b052-f1f8-45d4-b837-3acca46f7e39-utilities\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.478353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b4b052-f1f8-45d4-b837-3acca46f7e39-catalog-content\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.478398 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpnl\" (UniqueName: \"kubernetes.io/projected/e1b4b052-f1f8-45d4-b837-3acca46f7e39-kube-api-access-4tpnl\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.480175 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1b4b052-f1f8-45d4-b837-3acca46f7e39-utilities\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.480431 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1b4b052-f1f8-45d4-b837-3acca46f7e39-catalog-content\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.498435 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxg85"] Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.499752 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.503645 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.514540 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxg85"] Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.515019 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpnl\" (UniqueName: \"kubernetes.io/projected/e1b4b052-f1f8-45d4-b837-3acca46f7e39-kube-api-access-4tpnl\") pod \"redhat-operators-6prtm\" (UID: \"e1b4b052-f1f8-45d4-b837-3acca46f7e39\") " pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.579781 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86kxw\" (UniqueName: \"kubernetes.io/projected/560dd02b-9de9-4489-b95c-b039bcd21e3e-kube-api-access-86kxw\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.579994 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-catalog-content\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.580025 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-utilities\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.626487 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6prtm" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.681442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-catalog-content\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.681490 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-utilities\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.681549 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86kxw\" (UniqueName: \"kubernetes.io/projected/560dd02b-9de9-4489-b95c-b039bcd21e3e-kube-api-access-86kxw\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " 
pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.682274 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-catalog-content\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.682892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-utilities\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.700827 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86kxw\" (UniqueName: \"kubernetes.io/projected/560dd02b-9de9-4489-b95c-b039bcd21e3e-kube-api-access-86kxw\") pod \"community-operators-sxg85\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " pod="openshift-marketplace/community-operators-sxg85" Oct 10 06:55:50 crc kubenswrapper[4732]: I1010 06:55:50.828442 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxg85"
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.015961 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxg85"]
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.026241 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6prtm"]
Oct 10 06:55:51 crc kubenswrapper[4732]: W1010 06:55:51.034239 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b4b052_f1f8_45d4_b837_3acca46f7e39.slice/crio-30d0ef2ac453fb6ded5b20fa9905e42480c092b2c517203dadfb119a165a8e93 WatchSource:0}: Error finding container 30d0ef2ac453fb6ded5b20fa9905e42480c092b2c517203dadfb119a165a8e93: Status 404 returned error can't find the container with id 30d0ef2ac453fb6ded5b20fa9905e42480c092b2c517203dadfb119a165a8e93
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.254840 4732 generic.go:334] "Generic (PLEG): container finished" podID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerID="fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e" exitCode=0
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.254920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxg85" event={"ID":"560dd02b-9de9-4489-b95c-b039bcd21e3e","Type":"ContainerDied","Data":"fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e"}
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.254974 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxg85" event={"ID":"560dd02b-9de9-4489-b95c-b039bcd21e3e","Type":"ContainerStarted","Data":"ced73c561c8593237786ff8c57f84b3557e7f201e14b182defe384481979a994"}
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.258338 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p868x" event={"ID":"9853e548-8347-4b73-b09b-4315d6956d6a","Type":"ContainerStarted","Data":"68f080227281112e65d39070419e116021a2b76104da6d1f6e0abad44de0fa0b"}
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.262873 4732 generic.go:334] "Generic (PLEG): container finished" podID="e1b4b052-f1f8-45d4-b837-3acca46f7e39" containerID="11dfc5aca5dd8fdcc540ac2b6cc94b174d4af89d105ed13aea1bf08e94b5e706" exitCode=0
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.262939 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6prtm" event={"ID":"e1b4b052-f1f8-45d4-b837-3acca46f7e39","Type":"ContainerDied","Data":"11dfc5aca5dd8fdcc540ac2b6cc94b174d4af89d105ed13aea1bf08e94b5e706"}
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.262959 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6prtm" event={"ID":"e1b4b052-f1f8-45d4-b837-3acca46f7e39","Type":"ContainerStarted","Data":"30d0ef2ac453fb6ded5b20fa9905e42480c092b2c517203dadfb119a165a8e93"}
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.265333 4732 generic.go:334] "Generic (PLEG): container finished" podID="892e4d19-13af-44ac-ad0c-709b2200a088" containerID="87448fcfb66f92407e4fd5bdc10df16cbad78a7cb57783eb7b54cea373588f34" exitCode=0
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.265370 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxp2x" event={"ID":"892e4d19-13af-44ac-ad0c-709b2200a088","Type":"ContainerDied","Data":"87448fcfb66f92407e4fd5bdc10df16cbad78a7cb57783eb7b54cea373588f34"}
Oct 10 06:55:51 crc kubenswrapper[4732]: I1010 06:55:51.294645 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p868x" podStartSLOduration=1.883855775 podStartE2EDuration="3.294625238s" podCreationTimestamp="2025-10-10 06:55:48 +0000 UTC" firstStartedPulling="2025-10-10 06:55:49.230340589 +0000 UTC m=+276.299931830" lastFinishedPulling="2025-10-10 06:55:50.641110052 +0000 UTC m=+277.710701293" observedRunningTime="2025-10-10 06:55:51.292814587 +0000 UTC m=+278.362405838" watchObservedRunningTime="2025-10-10 06:55:51.294625238 +0000 UTC m=+278.364216479"
Oct 10 06:55:52 crc kubenswrapper[4732]: I1010 06:55:52.275928 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxg85" event={"ID":"560dd02b-9de9-4489-b95c-b039bcd21e3e","Type":"ContainerStarted","Data":"2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8"}
Oct 10 06:55:52 crc kubenswrapper[4732]: I1010 06:55:52.278452 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6prtm" event={"ID":"e1b4b052-f1f8-45d4-b837-3acca46f7e39","Type":"ContainerStarted","Data":"e43782a12d587bbbabc4265e7afe03177fed80308395c59e3c63131fe3afde48"}
Oct 10 06:55:52 crc kubenswrapper[4732]: I1010 06:55:52.282144 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nxp2x" event={"ID":"892e4d19-13af-44ac-ad0c-709b2200a088","Type":"ContainerStarted","Data":"4aff789e41611face39142f693b3a9b25894d55cae240f39bcb64b3e545c72b6"}
Oct 10 06:55:52 crc kubenswrapper[4732]: I1010 06:55:52.340964 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nxp2x" podStartSLOduration=2.684172144 podStartE2EDuration="5.340946624s" podCreationTimestamp="2025-10-10 06:55:47 +0000 UTC" firstStartedPulling="2025-10-10 06:55:49.227278073 +0000 UTC m=+276.296869314" lastFinishedPulling="2025-10-10 06:55:51.884052553 +0000 UTC m=+278.953643794" observedRunningTime="2025-10-10 06:55:52.322012329 +0000 UTC m=+279.391603570" watchObservedRunningTime="2025-10-10 06:55:52.340946624 +0000 UTC m=+279.410537865"
Oct 10 06:55:53 crc kubenswrapper[4732]: I1010 06:55:53.289840 4732 generic.go:334] "Generic (PLEG): container finished" podID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerID="2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8" exitCode=0
Oct 10 06:55:53 crc kubenswrapper[4732]: I1010 06:55:53.289912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxg85" event={"ID":"560dd02b-9de9-4489-b95c-b039bcd21e3e","Type":"ContainerDied","Data":"2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8"}
Oct 10 06:55:53 crc kubenswrapper[4732]: I1010 06:55:53.295343 4732 generic.go:334] "Generic (PLEG): container finished" podID="e1b4b052-f1f8-45d4-b837-3acca46f7e39" containerID="e43782a12d587bbbabc4265e7afe03177fed80308395c59e3c63131fe3afde48" exitCode=0
Oct 10 06:55:53 crc kubenswrapper[4732]: I1010 06:55:53.296087 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6prtm" event={"ID":"e1b4b052-f1f8-45d4-b837-3acca46f7e39","Type":"ContainerDied","Data":"e43782a12d587bbbabc4265e7afe03177fed80308395c59e3c63131fe3afde48"}
Oct 10 06:55:55 crc kubenswrapper[4732]: I1010 06:55:55.309058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxg85" event={"ID":"560dd02b-9de9-4489-b95c-b039bcd21e3e","Type":"ContainerStarted","Data":"d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136"}
Oct 10 06:55:55 crc kubenswrapper[4732]: I1010 06:55:55.311232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6prtm" event={"ID":"e1b4b052-f1f8-45d4-b837-3acca46f7e39","Type":"ContainerStarted","Data":"00851fc3039158e72c97d722c3ebd81dce9f0c1f3e5b2dc83bac91e42d8e0819"}
Oct 10 06:55:55 crc kubenswrapper[4732]: I1010 06:55:55.330309 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxg85" podStartSLOduration=2.766668564 podStartE2EDuration="5.330287792s" podCreationTimestamp="2025-10-10 06:55:50 +0000 UTC" firstStartedPulling="2025-10-10 06:55:51.259183987 +0000 UTC m=+278.328775218" lastFinishedPulling="2025-10-10 06:55:53.822803195 +0000 UTC m=+280.892394446" observedRunningTime="2025-10-10 06:55:55.3263415 +0000 UTC m=+282.395932741" watchObservedRunningTime="2025-10-10 06:55:55.330287792 +0000 UTC m=+282.399879033"
Oct 10 06:55:55 crc kubenswrapper[4732]: I1010 06:55:55.345492 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6prtm" podStartSLOduration=2.88649843 podStartE2EDuration="5.345473251s" podCreationTimestamp="2025-10-10 06:55:50 +0000 UTC" firstStartedPulling="2025-10-10 06:55:51.264062765 +0000 UTC m=+278.333654006" lastFinishedPulling="2025-10-10 06:55:53.723037586 +0000 UTC m=+280.792628827" observedRunningTime="2025-10-10 06:55:55.34403604 +0000 UTC m=+282.413627291" watchObservedRunningTime="2025-10-10 06:55:55.345473251 +0000 UTC m=+282.415064492"
Oct 10 06:55:58 crc kubenswrapper[4732]: I1010 06:55:58.217031 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nxp2x"
Oct 10 06:55:58 crc kubenswrapper[4732]: I1010 06:55:58.218871 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nxp2x"
Oct 10 06:55:58 crc kubenswrapper[4732]: I1010 06:55:58.258233 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nxp2x"
Oct 10 06:55:58 crc kubenswrapper[4732]: I1010 06:55:58.370767 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nxp2x"
Oct 10 06:55:58 crc kubenswrapper[4732]: I1010 06:55:58.415584 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p868x"
Oct 10 06:55:58 crc kubenswrapper[4732]: I1010 06:55:58.415639 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p868x"
Oct 10 06:55:58 crc kubenswrapper[4732]: I1010 06:55:58.455051 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p868x"
Oct 10 06:55:59 crc kubenswrapper[4732]: I1010 06:55:59.366501 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p868x"
Oct 10 06:56:00 crc kubenswrapper[4732]: I1010 06:56:00.626764 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6prtm"
Oct 10 06:56:00 crc kubenswrapper[4732]: I1010 06:56:00.627103 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6prtm"
Oct 10 06:56:00 crc kubenswrapper[4732]: I1010 06:56:00.665270 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6prtm"
Oct 10 06:56:00 crc kubenswrapper[4732]: I1010 06:56:00.828795 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sxg85"
Oct 10 06:56:00 crc kubenswrapper[4732]: I1010 06:56:00.829128 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxg85"
Oct 10 06:56:00 crc kubenswrapper[4732]: I1010 06:56:00.869548 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxg85"
Oct 10 06:56:01 crc kubenswrapper[4732]: I1010 06:56:01.385064 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxg85"
Oct 10 06:56:01 crc kubenswrapper[4732]: I1010 06:56:01.385457 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6prtm"
Oct 10 06:56:55 crc kubenswrapper[4732]: I1010 06:56:55.355842 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 06:56:55 crc kubenswrapper[4732]: I1010 06:56:55.356983 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 06:57:25 crc kubenswrapper[4732]: I1010 06:57:25.356045 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 06:57:25 crc kubenswrapper[4732]: I1010 06:57:25.356599 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 06:57:55 crc kubenswrapper[4732]: I1010 06:57:55.356022 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 06:57:55 crc kubenswrapper[4732]: I1010 06:57:55.356668 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 06:57:55 crc kubenswrapper[4732]: I1010 06:57:55.356792 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd"
Oct 10 06:57:55 crc kubenswrapper[4732]: I1010 06:57:55.357791 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92d7821c55f776aa0ba046ed761a87a67e7ac50878e724a90c5ab07c8b2d6230"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 10 06:57:55 crc kubenswrapper[4732]: I1010 06:57:55.357899 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://92d7821c55f776aa0ba046ed761a87a67e7ac50878e724a90c5ab07c8b2d6230" gracePeriod=600
Oct 10 06:57:56 crc kubenswrapper[4732]: I1010 06:57:56.028986 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="92d7821c55f776aa0ba046ed761a87a67e7ac50878e724a90c5ab07c8b2d6230" exitCode=0
Oct 10 06:57:56 crc kubenswrapper[4732]: I1010 06:57:56.029040 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"92d7821c55f776aa0ba046ed761a87a67e7ac50878e724a90c5ab07c8b2d6230"}
Oct 10 06:57:56 crc kubenswrapper[4732]: I1010 06:57:56.029293 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"0f1dcf57554420a877507004f1258c39575e3a24cbe7c5b8ca3aeabfcdb7b710"}
Oct 10 06:57:56 crc kubenswrapper[4732]: I1010 06:57:56.029314 4732 scope.go:117] "RemoveContainer" containerID="3699e2dc06f0016d29590b4bd0edc16d4af20280fdc75f736aafe69265da9d59"
Oct 10 06:59:46 crc kubenswrapper[4732]: I1010 06:59:46.928759 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5sxj"]
Oct 10 06:59:46 crc kubenswrapper[4732]: I1010 06:59:46.929920 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:46 crc kubenswrapper[4732]: I1010 06:59:46.953761 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5sxj"]
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016043 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6ht4\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-kube-api-access-k6ht4\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f28e5c8-88e0-4f5a-851b-925bdf759454-trusted-ca\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-bound-sa-token\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016156 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-registry-tls\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016180 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016256 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f28e5c8-88e0-4f5a-851b-925bdf759454-registry-certificates\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016323 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f28e5c8-88e0-4f5a-851b-925bdf759454-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.016398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f28e5c8-88e0-4f5a-851b-925bdf759454-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.043549 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.117790 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6ht4\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-kube-api-access-k6ht4\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.117850 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f28e5c8-88e0-4f5a-851b-925bdf759454-trusted-ca\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.117877 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-bound-sa-token\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.117896 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-registry-tls\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.117917 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f28e5c8-88e0-4f5a-851b-925bdf759454-registry-certificates\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.117937 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f28e5c8-88e0-4f5a-851b-925bdf759454-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.117959 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f28e5c8-88e0-4f5a-851b-925bdf759454-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.119010 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f28e5c8-88e0-4f5a-851b-925bdf759454-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.119782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f28e5c8-88e0-4f5a-851b-925bdf759454-trusted-ca\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.119835 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f28e5c8-88e0-4f5a-851b-925bdf759454-registry-certificates\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.124680 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f28e5c8-88e0-4f5a-851b-925bdf759454-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.130497 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-registry-tls\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.139589 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6ht4\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-kube-api-access-k6ht4\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.147056 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f28e5c8-88e0-4f5a-851b-925bdf759454-bound-sa-token\") pod \"image-registry-66df7c8f76-p5sxj\" (UID: \"8f28e5c8-88e0-4f5a-851b-925bdf759454\") " pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.248011 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.443795 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p5sxj"]
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.726551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj" event={"ID":"8f28e5c8-88e0-4f5a-851b-925bdf759454","Type":"ContainerStarted","Data":"f5bdb07759de1b7c331466a9dc95832abff1743e33cd564f3d8eb8df10724b56"}
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.726655 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj" event={"ID":"8f28e5c8-88e0-4f5a-851b-925bdf759454","Type":"ContainerStarted","Data":"7cd19f948fa9e7589ae9f3760e2414434fad4e4ad8113353b34cb3695a90ba1d"}
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.726732 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 06:59:47 crc kubenswrapper[4732]: I1010 06:59:47.746358 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj" podStartSLOduration=1.746332914 podStartE2EDuration="1.746332914s" podCreationTimestamp="2025-10-10 06:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 06:59:47.74183073 +0000 UTC m=+514.811422011" watchObservedRunningTime="2025-10-10 06:59:47.746332914 +0000 UTC m=+514.815924185"
Oct 10 06:59:55 crc kubenswrapper[4732]: I1010 06:59:55.356170 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 06:59:55 crc kubenswrapper[4732]: I1010 06:59:55.356901 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.150809 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"]
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.154969 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.158461 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.158878 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"]
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.158976 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.258578 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75983d33-55cf-4310-853f-d3e2b7fefbe3-secret-volume\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.258682 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcbm\" (UniqueName: \"kubernetes.io/projected/75983d33-55cf-4310-853f-d3e2b7fefbe3-kube-api-access-8gcbm\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.258748 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75983d33-55cf-4310-853f-d3e2b7fefbe3-config-volume\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.360546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcbm\" (UniqueName: \"kubernetes.io/projected/75983d33-55cf-4310-853f-d3e2b7fefbe3-kube-api-access-8gcbm\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.360652 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75983d33-55cf-4310-853f-d3e2b7fefbe3-config-volume\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.360807 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75983d33-55cf-4310-853f-d3e2b7fefbe3-secret-volume\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.362001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75983d33-55cf-4310-853f-d3e2b7fefbe3-config-volume\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.373776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75983d33-55cf-4310-853f-d3e2b7fefbe3-secret-volume\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.379204 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcbm\" (UniqueName: \"kubernetes.io/projected/75983d33-55cf-4310-853f-d3e2b7fefbe3-kube-api-access-8gcbm\") pod \"collect-profiles-29334660-wr6cl\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.481655 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:00 crc kubenswrapper[4732]: I1010 07:00:00.870004 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"]
Oct 10 07:00:01 crc kubenswrapper[4732]: I1010 07:00:01.811075 4732 generic.go:334] "Generic (PLEG): container finished" podID="75983d33-55cf-4310-853f-d3e2b7fefbe3" containerID="034b569f27e9cacb24405c7c443f6d0c37905d2a951ea730a3544339e0ce7a93" exitCode=0
Oct 10 07:00:01 crc kubenswrapper[4732]: I1010 07:00:01.811200 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl" event={"ID":"75983d33-55cf-4310-853f-d3e2b7fefbe3","Type":"ContainerDied","Data":"034b569f27e9cacb24405c7c443f6d0c37905d2a951ea730a3544339e0ce7a93"}
Oct 10 07:00:01 crc kubenswrapper[4732]: I1010 07:00:01.811435 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl" event={"ID":"75983d33-55cf-4310-853f-d3e2b7fefbe3","Type":"ContainerStarted","Data":"26cc71a73fee52cf62db34331b6dff5d744196b3480041701c73d8d85814bdc2"}
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.022608 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.196557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75983d33-55cf-4310-853f-d3e2b7fefbe3-config-volume\") pod \"75983d33-55cf-4310-853f-d3e2b7fefbe3\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") "
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.196961 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75983d33-55cf-4310-853f-d3e2b7fefbe3-secret-volume\") pod \"75983d33-55cf-4310-853f-d3e2b7fefbe3\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") "
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.196978 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gcbm\" (UniqueName: \"kubernetes.io/projected/75983d33-55cf-4310-853f-d3e2b7fefbe3-kube-api-access-8gcbm\") pod \"75983d33-55cf-4310-853f-d3e2b7fefbe3\" (UID: \"75983d33-55cf-4310-853f-d3e2b7fefbe3\") "
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.197462 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75983d33-55cf-4310-853f-d3e2b7fefbe3-config-volume" (OuterVolumeSpecName: "config-volume") pod "75983d33-55cf-4310-853f-d3e2b7fefbe3" (UID: "75983d33-55cf-4310-853f-d3e2b7fefbe3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.202289 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75983d33-55cf-4310-853f-d3e2b7fefbe3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75983d33-55cf-4310-853f-d3e2b7fefbe3" (UID: "75983d33-55cf-4310-853f-d3e2b7fefbe3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.202836 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75983d33-55cf-4310-853f-d3e2b7fefbe3-kube-api-access-8gcbm" (OuterVolumeSpecName: "kube-api-access-8gcbm") pod "75983d33-55cf-4310-853f-d3e2b7fefbe3" (UID: "75983d33-55cf-4310-853f-d3e2b7fefbe3"). InnerVolumeSpecName "kube-api-access-8gcbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.297835 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75983d33-55cf-4310-853f-d3e2b7fefbe3-config-volume\") on node \"crc\" DevicePath \"\""
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.297878 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75983d33-55cf-4310-853f-d3e2b7fefbe3-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.297892 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gcbm\" (UniqueName: \"kubernetes.io/projected/75983d33-55cf-4310-853f-d3e2b7fefbe3-kube-api-access-8gcbm\") on node \"crc\" DevicePath \"\""
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.824973 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl" event={"ID":"75983d33-55cf-4310-853f-d3e2b7fefbe3","Type":"ContainerDied","Data":"26cc71a73fee52cf62db34331b6dff5d744196b3480041701c73d8d85814bdc2"}
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.825051 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26cc71a73fee52cf62db34331b6dff5d744196b3480041701c73d8d85814bdc2"
Oct 10 07:00:03 crc kubenswrapper[4732]: I1010 07:00:03.825147 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"
Oct 10 07:00:07 crc kubenswrapper[4732]: I1010 07:00:07.256952 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p5sxj"
Oct 10 07:00:07 crc kubenswrapper[4732]: I1010 07:00:07.311550 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8tq5m"]
Oct 10 07:00:13 crc kubenswrapper[4732]: I1010 07:00:13.775844 4732 scope.go:117] "RemoveContainer" containerID="1f822d7a5347133d62ee7db592f0d0892b78a9f37aecf9608a703350f0738486"
Oct 10 07:00:13 crc kubenswrapper[4732]: I1010 07:00:13.812056 4732 scope.go:117] "RemoveContainer" containerID="7b8c8e45af5ec9f1f11423fa22eb324c32fbd9b5218f56a40f210b51fa21fe5d"
Oct 10 07:00:25 crc kubenswrapper[4732]: I1010 07:00:25.356472 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 07:00:25 crc kubenswrapper[4732]: I1010 07:00:25.357843 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.373687 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" podUID="d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" containerName="registry" containerID="cri-o://c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9" gracePeriod=30 Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.758187 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815197 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzqnq\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-kube-api-access-lzqnq\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815261 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-trusted-ca\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815330 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-certificates\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-installation-pull-secrets\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815379 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-ca-trust-extracted\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815742 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815783 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-bound-sa-token\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.815805 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-tls\") pod \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\" (UID: \"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945\") " Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.816670 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.816722 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.822707 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.825021 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.825454 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.825557 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-kube-api-access-lzqnq" (OuterVolumeSpecName: "kube-api-access-lzqnq") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "kube-api-access-lzqnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.827493 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.853595 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" (UID: "d3d2fadd-7bf3-40b3-9c28-69a63eeb3945"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.917132 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.917424 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.917509 4732 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.917575 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.917682 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.917822 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.917905 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzqnq\" (UniqueName: \"kubernetes.io/projected/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945-kube-api-access-lzqnq\") on node \"crc\" DevicePath \"\"" Oct 10 07:00:32 crc 
kubenswrapper[4732]: I1010 07:00:32.998257 4732 generic.go:334] "Generic (PLEG): container finished" podID="d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" containerID="c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9" exitCode=0 Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.998304 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" event={"ID":"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945","Type":"ContainerDied","Data":"c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9"} Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.998344 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" event={"ID":"d3d2fadd-7bf3-40b3-9c28-69a63eeb3945","Type":"ContainerDied","Data":"b1efda1d0864183d1f6fa3d420e62ef55042a556bbd3af357d69d7b32378a008"} Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.998367 4732 scope.go:117] "RemoveContainer" containerID="c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9" Oct 10 07:00:32 crc kubenswrapper[4732]: I1010 07:00:32.998664 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-8tq5m" Oct 10 07:00:33 crc kubenswrapper[4732]: I1010 07:00:33.023024 4732 scope.go:117] "RemoveContainer" containerID="c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9" Oct 10 07:00:33 crc kubenswrapper[4732]: E1010 07:00:33.024319 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9\": container with ID starting with c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9 not found: ID does not exist" containerID="c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9" Oct 10 07:00:33 crc kubenswrapper[4732]: I1010 07:00:33.024374 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9"} err="failed to get container status \"c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9\": rpc error: code = NotFound desc = could not find container \"c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9\": container with ID starting with c8a8ad1a73a5586a7ce1bb672d4f67dcd842d5ebf1e218168a08bf4b5e32e3c9 not found: ID does not exist" Oct 10 07:00:33 crc kubenswrapper[4732]: I1010 07:00:33.045905 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8tq5m"] Oct 10 07:00:33 crc kubenswrapper[4732]: I1010 07:00:33.046457 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-8tq5m"] Oct 10 07:00:33 crc kubenswrapper[4732]: I1010 07:00:33.670014 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" path="/var/lib/kubelet/pods/d3d2fadd-7bf3-40b3-9c28-69a63eeb3945/volumes" Oct 10 07:00:55 crc kubenswrapper[4732]: I1010 
07:00:55.356509 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:00:55 crc kubenswrapper[4732]: I1010 07:00:55.359151 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:00:55 crc kubenswrapper[4732]: I1010 07:00:55.359326 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:00:55 crc kubenswrapper[4732]: I1010 07:00:55.360305 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0f1dcf57554420a877507004f1258c39575e3a24cbe7c5b8ca3aeabfcdb7b710"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:00:55 crc kubenswrapper[4732]: I1010 07:00:55.360446 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://0f1dcf57554420a877507004f1258c39575e3a24cbe7c5b8ca3aeabfcdb7b710" gracePeriod=600 Oct 10 07:00:56 crc kubenswrapper[4732]: I1010 07:00:56.131284 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="0f1dcf57554420a877507004f1258c39575e3a24cbe7c5b8ca3aeabfcdb7b710" exitCode=0 Oct 10 
07:00:56 crc kubenswrapper[4732]: I1010 07:00:56.131340 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"0f1dcf57554420a877507004f1258c39575e3a24cbe7c5b8ca3aeabfcdb7b710"} Oct 10 07:00:56 crc kubenswrapper[4732]: I1010 07:00:56.131848 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"53a6572dbd94b6a842e7d21f0e7b30a3d933ba27c7ff2433ef57f5a4d47b6c8d"} Oct 10 07:00:56 crc kubenswrapper[4732]: I1010 07:00:56.131871 4732 scope.go:117] "RemoveContainer" containerID="92d7821c55f776aa0ba046ed761a87a67e7ac50878e724a90c5ab07c8b2d6230" Oct 10 07:01:13 crc kubenswrapper[4732]: I1010 07:01:13.857310 4732 scope.go:117] "RemoveContainer" containerID="737e5a864b012a74d196b97f75f1f721a0832c726396659f0771984f7a10572b" Oct 10 07:01:13 crc kubenswrapper[4732]: I1010 07:01:13.884009 4732 scope.go:117] "RemoveContainer" containerID="279968cbb05cb93687a6a5943eab5c0883e2301eac01266f6c136bf4ce126fee" Oct 10 07:01:13 crc kubenswrapper[4732]: I1010 07:01:13.902911 4732 scope.go:117] "RemoveContainer" containerID="788e2db5f54f91da79e522b6557954ec4c924bc98ba6c71e093c63947d9bf836" Oct 10 07:01:13 crc kubenswrapper[4732]: I1010 07:01:13.932206 4732 scope.go:117] "RemoveContainer" containerID="e7c5a18b2a21d3d93e7b6514a2b41e3119159c277d861dd6a035752668759c5f" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.788749 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-qqmtc"] Oct 10 07:02:37 crc kubenswrapper[4732]: E1010 07:02:37.789915 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75983d33-55cf-4310-853f-d3e2b7fefbe3" containerName="collect-profiles" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.789932 4732 
state_mem.go:107] "Deleted CPUSet assignment" podUID="75983d33-55cf-4310-853f-d3e2b7fefbe3" containerName="collect-profiles" Oct 10 07:02:37 crc kubenswrapper[4732]: E1010 07:02:37.789957 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" containerName="registry" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.789964 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" containerName="registry" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.790083 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d2fadd-7bf3-40b3-9c28-69a63eeb3945" containerName="registry" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.790094 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="75983d33-55cf-4310-853f-d3e2b7fefbe3" containerName="collect-profiles" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.791478 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.793763 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.794187 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.794381 4732 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-hdknk" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.794567 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.802957 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qqmtc"] Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.818057 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbvh2\" (UniqueName: \"kubernetes.io/projected/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-kube-api-access-pbvh2\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.818142 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-node-mnt\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.818186 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-crc-storage\") pod \"crc-storage-crc-qqmtc\" (UID: 
\"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.919836 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-node-mnt\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.919992 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-crc-storage\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.920134 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-node-mnt\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.920153 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbvh2\" (UniqueName: \"kubernetes.io/projected/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-kube-api-access-pbvh2\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.921264 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-crc-storage\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:37 crc kubenswrapper[4732]: I1010 07:02:37.942145 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbvh2\" (UniqueName: \"kubernetes.io/projected/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-kube-api-access-pbvh2\") pod \"crc-storage-crc-qqmtc\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:38 crc kubenswrapper[4732]: I1010 07:02:38.114885 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:38 crc kubenswrapper[4732]: I1010 07:02:38.320993 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qqmtc"] Oct 10 07:02:38 crc kubenswrapper[4732]: I1010 07:02:38.328473 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:02:38 crc kubenswrapper[4732]: I1010 07:02:38.754286 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qqmtc" event={"ID":"928bd8d4-82cc-4c7c-8d05-342e3e4b13db","Type":"ContainerStarted","Data":"43b0d67c5fa08cc5d7da22ca0c651a0b24300943d75af9dc0b728131a33f11f2"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.278119 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kdb2x"] Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.278854 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-controller" containerID="cri-o://8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.279258 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="sbdb" 
containerID="cri-o://ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.279332 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-node" containerID="cri-o://880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.279326 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.279378 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-acl-logging" containerID="cri-o://401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.279434 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="nbdb" containerID="cri-o://a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.279484 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="northd" containerID="cri-o://a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 
07:02:39.324201 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" containerID="cri-o://74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" gracePeriod=30 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.732949 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/3.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.736160 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovn-acl-logging/0.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.738125 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovn-controller/0.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.738835 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.769661 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovnkube-controller/3.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.783258 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovn-acl-logging/0.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.783725 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kdb2x_f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/ovn-controller/0.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784237 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" exitCode=0 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784272 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" exitCode=0 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784283 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" exitCode=0 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784294 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" exitCode=0 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784306 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" 
containerID="f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" exitCode=0 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784315 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" exitCode=0 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784324 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" exitCode=143 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784334 4732 generic.go:334] "Generic (PLEG): container finished" podID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerID="8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" exitCode=143 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784397 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784430 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784446 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784459 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" 
event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784471 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784483 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784495 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784506 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784512 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784521 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784527 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784533 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784539 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784546 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784552 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784573 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784581 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784588 4732 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784595 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784602 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784609 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784618 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784627 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784634 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784641 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784652 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784663 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784672 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784679 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784715 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784723 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784731 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784737 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} Oct 10 
07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784743 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784750 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784756 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784766 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" event={"ID":"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63","Type":"ContainerDied","Data":"af9c9e0e18853b2aa63161efdfd53cbb41ea8447255e61b04bd6b8c6058b43e1"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784777 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784784 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784791 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784798 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784804 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784811 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784819 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784837 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784844 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784852 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.784867 4732 scope.go:117] "RemoveContainer" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.785021 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kdb2x" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.790435 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/2.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.792164 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/1.log" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.792219 4732 generic.go:334] "Generic (PLEG): container finished" podID="d94cc3c3-3cb6-4a5b-996b-90099415f9bf" containerID="a37376247646e5fbbfa00dc811de67eb8ada41b85d70795bc6c90d092c37d808" exitCode=2 Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.792250 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerDied","Data":"a37376247646e5fbbfa00dc811de67eb8ada41b85d70795bc6c90d092c37d808"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.792273 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209"} Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.792828 4732 scope.go:117] "RemoveContainer" containerID="a37376247646e5fbbfa00dc811de67eb8ada41b85d70795bc6c90d092c37d808" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.793014 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pnlkp_openshift-multus(d94cc3c3-3cb6-4a5b-996b-90099415f9bf)\"" pod="openshift-multus/multus-pnlkp" podUID="d94cc3c3-3cb6-4a5b-996b-90099415f9bf" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.799858 4732 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hbj7g"] Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800054 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-acl-logging" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800065 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-acl-logging" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800075 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800084 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800091 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kubecfg-setup" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800096 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kubecfg-setup" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800104 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="nbdb" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800109 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="nbdb" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800118 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800124 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800135 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="sbdb" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800141 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="sbdb" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800152 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800158 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800167 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-ovn-metrics" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800172 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-ovn-metrics" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800179 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="northd" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800186 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="northd" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800195 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-node" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800202 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-node" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800213 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800219 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800308 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800317 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-node" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800328 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="kube-rbac-proxy-ovn-metrics" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800334 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800343 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800349 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="nbdb" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800356 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="sbdb" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800363 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800369 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800401 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovn-acl-logging" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800409 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800415 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="northd" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800493 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800500 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.800507 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.800512 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" containerName="ovnkube-controller" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.802526 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.813450 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.839533 4732 scope.go:117] "RemoveContainer" containerID="ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846197 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-bin\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-netns\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846314 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-ovn-kubernetes\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846336 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-systemd-units\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846365 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-netd\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846361 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846470 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846503 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846518 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846546 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846666 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.846396 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848461 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-node-log" (OuterVolumeSpecName: "node-log") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848497 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-node-log\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848534 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-kubelet\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848566 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-ovn\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848602 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848618 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848740 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgdch\" (UniqueName: \"kubernetes.io/projected/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-kube-api-access-xgdch\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848793 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-config\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.848810 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-systemd\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849207 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849250 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-openvswitch\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849269 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-var-lib-openvswitch\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849318 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849337 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849356 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-env-overrides\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849772 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849806 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-script-lib\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849928 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-slash\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849945 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-etc-openvswitch\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.849959 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-log-socket\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850068 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovn-node-metrics-cert\") pod \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\" (UID: \"f77a19b4-118c-4b7d-9ef2-b7be7fd33e63\") " Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850001 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-slash" (OuterVolumeSpecName: "host-slash") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850016 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-log-socket" (OuterVolumeSpecName: "log-socket") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850189 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850188 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850217 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-systemd\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850245 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-etc-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850289 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-env-overrides\") pod 
\"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850306 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850365 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-ovnkube-script-lib\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850388 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-run-ovn-kubernetes\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850437 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-cni-netd\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850498 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-systemd-units\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850617 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-ovnkube-config\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850674 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-log-socket\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850746 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-run-netns\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850806 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-slash\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850836 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w4z2b\" (UniqueName: \"kubernetes.io/projected/4666b47c-ea37-4f86-b613-2789a0f6b193-kube-api-access-w4z2b\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850856 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-ovn\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4666b47c-ea37-4f86-b613-2789a0f6b193-ovn-node-metrics-cert\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850912 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-node-log\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.850951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-cni-bin\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.851383 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-var-lib-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852306 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-kubelet\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852479 4732 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852509 4732 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852529 4732 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852546 4732 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852562 4732 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-cni-netd\") on node \"crc\" 
DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852584 4732 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852602 4732 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-node-log\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852620 4732 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852639 4732 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852658 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852672 4732 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852829 4732 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852848 4732 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852870 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852900 4732 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-host-slash\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852916 4732 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.852932 4732 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-log-socket\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.857145 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-kube-api-access-xgdch" (OuterVolumeSpecName: "kube-api-access-xgdch") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "kube-api-access-xgdch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.857910 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.865954 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" (UID: "f77a19b4-118c-4b7d-9ef2-b7be7fd33e63"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.903016 4732 scope.go:117] "RemoveContainer" containerID="a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.917824 4732 scope.go:117] "RemoveContainer" containerID="a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.928524 4732 scope.go:117] "RemoveContainer" containerID="f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.940919 4732 scope.go:117] "RemoveContainer" containerID="880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953150 4732 scope.go:117] "RemoveContainer" containerID="401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953590 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-ovnkube-script-lib\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953639 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-run-ovn-kubernetes\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953670 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-cni-netd\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953710 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-systemd-units\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953729 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-ovnkube-config\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953745 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-log-socket\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953742 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-run-ovn-kubernetes\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953764 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-run-netns\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953812 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-systemd-units\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953812 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-log-socket\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953852 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-run-netns\") pod \"ovnkube-node-hbj7g\" 
(UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953856 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-cni-netd\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953888 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-slash\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.953906 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-slash\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954038 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4z2b\" (UniqueName: \"kubernetes.io/projected/4666b47c-ea37-4f86-b613-2789a0f6b193-kube-api-access-w4z2b\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954251 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-ovn\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 
07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4666b47c-ea37-4f86-b613-2789a0f6b193-ovn-node-metrics-cert\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-node-log\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-ovnkube-config\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954513 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-ovnkube-script-lib\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-cni-bin\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954565 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-node-log\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-ovn\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-cni-bin\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954636 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-var-lib-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954710 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-kubelet\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954733 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-systemd\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954771 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-var-lib-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-etc-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954784 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-kubelet\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954830 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-systemd\") pod 
\"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954822 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-etc-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954856 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-env-overrides\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954871 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-run-openvswitch\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954922 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.954980 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4666b47c-ea37-4f86-b613-2789a0f6b193-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.955025 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgdch\" (UniqueName: \"kubernetes.io/projected/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-kube-api-access-xgdch\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.955039 4732 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.955049 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.955394 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4666b47c-ea37-4f86-b613-2789a0f6b193-env-overrides\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.957339 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4666b47c-ea37-4f86-b613-2789a0f6b193-ovn-node-metrics-cert\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.966082 4732 scope.go:117] "RemoveContainer" containerID="8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.972021 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-w4z2b\" (UniqueName: \"kubernetes.io/projected/4666b47c-ea37-4f86-b613-2789a0f6b193-kube-api-access-w4z2b\") pod \"ovnkube-node-hbj7g\" (UID: \"4666b47c-ea37-4f86-b613-2789a0f6b193\") " pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.981151 4732 scope.go:117] "RemoveContainer" containerID="6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.994605 4732 scope.go:117] "RemoveContainer" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.995064 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": container with ID starting with 74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d not found: ID does not exist" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.995097 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} err="failed to get container status \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": rpc error: code = NotFound desc = could not find container \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": container with ID starting with 74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.995119 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.995518 4732 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": container with ID starting with 560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479 not found: ID does not exist" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.995563 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} err="failed to get container status \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": rpc error: code = NotFound desc = could not find container \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": container with ID starting with 560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.995591 4732 scope.go:117] "RemoveContainer" containerID="ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.995980 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": container with ID starting with ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1 not found: ID does not exist" containerID="ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.996005 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} err="failed to get container status \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": rpc error: code = NotFound desc = could not find container 
\"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": container with ID starting with ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.996018 4732 scope.go:117] "RemoveContainer" containerID="a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.996504 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": container with ID starting with a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30 not found: ID does not exist" containerID="a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.996551 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} err="failed to get container status \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": rpc error: code = NotFound desc = could not find container \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": container with ID starting with a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.996583 4732 scope.go:117] "RemoveContainer" containerID="a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.996889 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": container with ID starting with a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd not found: ID does not exist" 
containerID="a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.996910 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} err="failed to get container status \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": rpc error: code = NotFound desc = could not find container \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": container with ID starting with a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.996924 4732 scope.go:117] "RemoveContainer" containerID="f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.997134 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": container with ID starting with f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444 not found: ID does not exist" containerID="f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.997162 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} err="failed to get container status \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": rpc error: code = NotFound desc = could not find container \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": container with ID starting with f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.997178 4732 scope.go:117] 
"RemoveContainer" containerID="880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.997481 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": container with ID starting with 880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908 not found: ID does not exist" containerID="880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.997507 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} err="failed to get container status \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": rpc error: code = NotFound desc = could not find container \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": container with ID starting with 880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.997522 4732 scope.go:117] "RemoveContainer" containerID="401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.997790 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": container with ID starting with 401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59 not found: ID does not exist" containerID="401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.997811 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} err="failed to get container status \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": rpc error: code = NotFound desc = could not find container \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": container with ID starting with 401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.997823 4732 scope.go:117] "RemoveContainer" containerID="8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.998040 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": container with ID starting with 8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73 not found: ID does not exist" containerID="8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998066 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} err="failed to get container status \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": rpc error: code = NotFound desc = could not find container \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": container with ID starting with 8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998080 4732 scope.go:117] "RemoveContainer" containerID="6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0" Oct 10 07:02:39 crc kubenswrapper[4732]: E1010 07:02:39.998301 4732 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": container with ID starting with 6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0 not found: ID does not exist" containerID="6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998330 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} err="failed to get container status \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": rpc error: code = NotFound desc = could not find container \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": container with ID starting with 6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998354 4732 scope.go:117] "RemoveContainer" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998572 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} err="failed to get container status \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": rpc error: code = NotFound desc = could not find container \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": container with ID starting with 74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998593 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998833 4732 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} err="failed to get container status \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": rpc error: code = NotFound desc = could not find container \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": container with ID starting with 560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.998853 4732 scope.go:117] "RemoveContainer" containerID="ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.999154 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} err="failed to get container status \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": rpc error: code = NotFound desc = could not find container \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": container with ID starting with ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.999172 4732 scope.go:117] "RemoveContainer" containerID="a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.999433 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} err="failed to get container status \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": rpc error: code = NotFound desc = could not find container \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": container with ID starting with 
a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30 not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.999450 4732 scope.go:117] "RemoveContainer" containerID="a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.999787 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} err="failed to get container status \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": rpc error: code = NotFound desc = could not find container \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": container with ID starting with a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd not found: ID does not exist" Oct 10 07:02:39 crc kubenswrapper[4732]: I1010 07:02:39.999814 4732 scope.go:117] "RemoveContainer" containerID="f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.000246 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} err="failed to get container status \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": rpc error: code = NotFound desc = could not find container \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": container with ID starting with f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.000290 4732 scope.go:117] "RemoveContainer" containerID="880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.000585 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} err="failed to get container status \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": rpc error: code = NotFound desc = could not find container \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": container with ID starting with 880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.000610 4732 scope.go:117] "RemoveContainer" containerID="401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.000941 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} err="failed to get container status \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": rpc error: code = NotFound desc = could not find container \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": container with ID starting with 401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.000961 4732 scope.go:117] "RemoveContainer" containerID="8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.001125 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} err="failed to get container status \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": rpc error: code = NotFound desc = could not find container \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": container with ID starting with 8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73 not found: ID does not 
exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.001144 4732 scope.go:117] "RemoveContainer" containerID="6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.001379 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} err="failed to get container status \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": rpc error: code = NotFound desc = could not find container \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": container with ID starting with 6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.001397 4732 scope.go:117] "RemoveContainer" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.001598 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} err="failed to get container status \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": rpc error: code = NotFound desc = could not find container \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": container with ID starting with 74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.001628 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.002522 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} err="failed to get container status 
\"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": rpc error: code = NotFound desc = could not find container \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": container with ID starting with 560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.002542 4732 scope.go:117] "RemoveContainer" containerID="ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.002886 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} err="failed to get container status \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": rpc error: code = NotFound desc = could not find container \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": container with ID starting with ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.002905 4732 scope.go:117] "RemoveContainer" containerID="a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.003114 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} err="failed to get container status \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": rpc error: code = NotFound desc = could not find container \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": container with ID starting with a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.003130 4732 scope.go:117] "RemoveContainer" 
containerID="a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.003495 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} err="failed to get container status \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": rpc error: code = NotFound desc = could not find container \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": container with ID starting with a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.003519 4732 scope.go:117] "RemoveContainer" containerID="f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.003871 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} err="failed to get container status \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": rpc error: code = NotFound desc = could not find container \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": container with ID starting with f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.003890 4732 scope.go:117] "RemoveContainer" containerID="880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.004161 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} err="failed to get container status \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": rpc error: code = NotFound desc = could 
not find container \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": container with ID starting with 880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.004179 4732 scope.go:117] "RemoveContainer" containerID="401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.004536 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} err="failed to get container status \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": rpc error: code = NotFound desc = could not find container \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": container with ID starting with 401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.004561 4732 scope.go:117] "RemoveContainer" containerID="8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.005133 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} err="failed to get container status \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": rpc error: code = NotFound desc = could not find container \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": container with ID starting with 8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.005157 4732 scope.go:117] "RemoveContainer" containerID="6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 
07:02:40.005475 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} err="failed to get container status \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": rpc error: code = NotFound desc = could not find container \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": container with ID starting with 6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.005494 4732 scope.go:117] "RemoveContainer" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.005861 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} err="failed to get container status \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": rpc error: code = NotFound desc = could not find container \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": container with ID starting with 74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.005892 4732 scope.go:117] "RemoveContainer" containerID="560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006137 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479"} err="failed to get container status \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": rpc error: code = NotFound desc = could not find container \"560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479\": container with ID starting with 
560072c4a6b2e8a718ba3d6e5f1e225c01c108783318f8bfc3302f9054f73479 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006158 4732 scope.go:117] "RemoveContainer" containerID="ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006442 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1"} err="failed to get container status \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": rpc error: code = NotFound desc = could not find container \"ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1\": container with ID starting with ef285a7ba683d8bf34bbdba92dfe336fa4b01103eb9463708ba827ea02ec14a1 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006459 4732 scope.go:117] "RemoveContainer" containerID="a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006717 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30"} err="failed to get container status \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": rpc error: code = NotFound desc = could not find container \"a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30\": container with ID starting with a1bedd2a2d548c24ceaa5c3426abd90dc711e005aafeb01c42862f068d3d5d30 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006745 4732 scope.go:117] "RemoveContainer" containerID="a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006968 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd"} err="failed to get container status \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": rpc error: code = NotFound desc = could not find container \"a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd\": container with ID starting with a09b8d03ce8d28ee96d852ce766021fcdbd7ebaffc04fa353bf132d07e9e20dd not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.006988 4732 scope.go:117] "RemoveContainer" containerID="f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.007262 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444"} err="failed to get container status \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": rpc error: code = NotFound desc = could not find container \"f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444\": container with ID starting with f0025a28cf63ef4d02b4cbb6c03a8f33e14d66d007bee4f5e89f5d107ef48444 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.007281 4732 scope.go:117] "RemoveContainer" containerID="880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.007504 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908"} err="failed to get container status \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": rpc error: code = NotFound desc = could not find container \"880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908\": container with ID starting with 880beef94a7373a15b9147449c21c4328b4ab5fa8e73e1718f9c8434bb8f4908 not found: ID does not 
exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.007521 4732 scope.go:117] "RemoveContainer" containerID="401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.007742 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59"} err="failed to get container status \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": rpc error: code = NotFound desc = could not find container \"401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59\": container with ID starting with 401d6340dba6b29d5dd515dab7756b6d1b8688b405b7440b8886ec8f9e8eff59 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.007760 4732 scope.go:117] "RemoveContainer" containerID="8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.008134 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73"} err="failed to get container status \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": rpc error: code = NotFound desc = could not find container \"8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73\": container with ID starting with 8f7e6f78c1dab2a724fa2ebb83457f0e535ae133d6c59ec970b7e0f3c85b1c73 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.008151 4732 scope.go:117] "RemoveContainer" containerID="6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.008370 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0"} err="failed to get container status 
\"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": rpc error: code = NotFound desc = could not find container \"6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0\": container with ID starting with 6aae39802504d179d5b3b83f98cd306d055491ef538f76758dfe27a1e07d26a0 not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.008399 4732 scope.go:117] "RemoveContainer" containerID="74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.008667 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d"} err="failed to get container status \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": rpc error: code = NotFound desc = could not find container \"74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d\": container with ID starting with 74a2f2f92fc5930b2327f4ad8a9693efbaf7fc7721e65343cea34f77fe59b39d not found: ID does not exist" Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.120947 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kdb2x"] Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.123350 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kdb2x"] Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.126706 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:40 crc kubenswrapper[4732]: W1010 07:02:40.142205 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4666b47c_ea37_4f86_b613_2789a0f6b193.slice/crio-9165fb8ade0af619fff5e31c736cb5788b66db292ce24138104e2d5a9c37902d WatchSource:0}: Error finding container 9165fb8ade0af619fff5e31c736cb5788b66db292ce24138104e2d5a9c37902d: Status 404 returned error can't find the container with id 9165fb8ade0af619fff5e31c736cb5788b66db292ce24138104e2d5a9c37902d Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.801679 4732 generic.go:334] "Generic (PLEG): container finished" podID="4666b47c-ea37-4f86-b613-2789a0f6b193" containerID="b1a4c5b152ae84ae508f91c045acc7239973ea7f3159996805b0ff2aa8a0f0d5" exitCode=0 Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.802187 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerDied","Data":"b1a4c5b152ae84ae508f91c045acc7239973ea7f3159996805b0ff2aa8a0f0d5"} Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.802226 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"9165fb8ade0af619fff5e31c736cb5788b66db292ce24138104e2d5a9c37902d"} Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.804766 4732 generic.go:334] "Generic (PLEG): container finished" podID="928bd8d4-82cc-4c7c-8d05-342e3e4b13db" containerID="e76947e3580479b5fbc545e5760973d36044777457306df7cc25f18c02d47e43" exitCode=0 Oct 10 07:02:40 crc kubenswrapper[4732]: I1010 07:02:40.804833 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qqmtc" 
event={"ID":"928bd8d4-82cc-4c7c-8d05-342e3e4b13db","Type":"ContainerDied","Data":"e76947e3580479b5fbc545e5760973d36044777457306df7cc25f18c02d47e43"} Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.667759 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f77a19b4-118c-4b7d-9ef2-b7be7fd33e63" path="/var/lib/kubelet/pods/f77a19b4-118c-4b7d-9ef2-b7be7fd33e63/volumes" Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.814523 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"e3c24d31b65d4304724723a210a9a6268f46e35f4304dc17dacdcf0c53e95eb8"} Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.814910 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"62784689d3e6f9ed31e915b1f5f4ffcaa7d886de0592c0b7c1e3767eb6fd6756"} Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.814924 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"4b86dd46e465fdd7c07d323e5c77efd9673913e928c77f3b3141b83e46ef359c"} Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.814933 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"d3862ed57e92797592f0533116a110bc79ce6f04340ad73ec15cac92b475c49d"} Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.814943 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"33f262d79b1e9cccc24143ef2d1c7aa06746eb4a3cb306f9f6319bc2bdbb8e2f"} 
Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.814953 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"4903b5c490fe6506d47464d186eb8f9c05e0a5a4acd4eb29cf101664144bc852"} Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.894135 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.980166 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-node-mnt\") pod \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.980242 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "928bd8d4-82cc-4c7c-8d05-342e3e4b13db" (UID: "928bd8d4-82cc-4c7c-8d05-342e3e4b13db"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.980275 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbvh2\" (UniqueName: \"kubernetes.io/projected/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-kube-api-access-pbvh2\") pod \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.980302 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-crc-storage\") pod \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\" (UID: \"928bd8d4-82cc-4c7c-8d05-342e3e4b13db\") " Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.980622 4732 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:41 crc kubenswrapper[4732]: I1010 07:02:41.988176 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-kube-api-access-pbvh2" (OuterVolumeSpecName: "kube-api-access-pbvh2") pod "928bd8d4-82cc-4c7c-8d05-342e3e4b13db" (UID: "928bd8d4-82cc-4c7c-8d05-342e3e4b13db"). InnerVolumeSpecName "kube-api-access-pbvh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:02:42 crc kubenswrapper[4732]: I1010 07:02:42.009591 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "928bd8d4-82cc-4c7c-8d05-342e3e4b13db" (UID: "928bd8d4-82cc-4c7c-8d05-342e3e4b13db"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:02:42 crc kubenswrapper[4732]: I1010 07:02:42.081614 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbvh2\" (UniqueName: \"kubernetes.io/projected/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-kube-api-access-pbvh2\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:42 crc kubenswrapper[4732]: I1010 07:02:42.081674 4732 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/928bd8d4-82cc-4c7c-8d05-342e3e4b13db-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 10 07:02:42 crc kubenswrapper[4732]: I1010 07:02:42.823385 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qqmtc" event={"ID":"928bd8d4-82cc-4c7c-8d05-342e3e4b13db","Type":"ContainerDied","Data":"43b0d67c5fa08cc5d7da22ca0c651a0b24300943d75af9dc0b728131a33f11f2"} Oct 10 07:02:42 crc kubenswrapper[4732]: I1010 07:02:42.823448 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b0d67c5fa08cc5d7da22ca0c651a0b24300943d75af9dc0b728131a33f11f2" Oct 10 07:02:42 crc kubenswrapper[4732]: I1010 07:02:42.823504 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qqmtc" Oct 10 07:02:43 crc kubenswrapper[4732]: I1010 07:02:43.837130 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"258ca2edac3bf17d8475880e54d27b7037fec4a813068e64ef3bc4b2f8d9d439"} Oct 10 07:02:46 crc kubenswrapper[4732]: I1010 07:02:46.863991 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" event={"ID":"4666b47c-ea37-4f86-b613-2789a0f6b193","Type":"ContainerStarted","Data":"4ca6526e5e6542a65b761fc2d551c8e6ad13a22436261b63b7e41d39ab6a3b5f"} Oct 10 07:02:46 crc kubenswrapper[4732]: I1010 07:02:46.865547 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:46 crc kubenswrapper[4732]: I1010 07:02:46.897990 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" podStartSLOduration=7.897966255 podStartE2EDuration="7.897966255s" podCreationTimestamp="2025-10-10 07:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:02:46.895956851 +0000 UTC m=+693.965548112" watchObservedRunningTime="2025-10-10 07:02:46.897966255 +0000 UTC m=+693.967557496" Oct 10 07:02:46 crc kubenswrapper[4732]: I1010 07:02:46.901063 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:47 crc kubenswrapper[4732]: I1010 07:02:47.869141 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:47 crc kubenswrapper[4732]: I1010 07:02:47.869193 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:47 crc kubenswrapper[4732]: I1010 07:02:47.903808 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.386758 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb"] Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.387000 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928bd8d4-82cc-4c7c-8d05-342e3e4b13db" containerName="storage" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.387013 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="928bd8d4-82cc-4c7c-8d05-342e3e4b13db" containerName="storage" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.387114 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="928bd8d4-82cc-4c7c-8d05-342e3e4b13db" containerName="storage" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.387958 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.391150 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.398545 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb"] Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.478303 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.478871 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ccpz\" (UniqueName: \"kubernetes.io/projected/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-kube-api-access-2ccpz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.479020 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: 
I1010 07:02:49.580512 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.580580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ccpz\" (UniqueName: \"kubernetes.io/projected/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-kube-api-access-2ccpz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.580612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.581153 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.581403 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.605855 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ccpz\" (UniqueName: \"kubernetes.io/projected/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-kube-api-access-2ccpz\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.704630 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.737155 4732 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(f26bf81718118b17d7fae6f7c0aea5141e9ac580ab4ce59fede8cc6347af6dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.737244 4732 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(f26bf81718118b17d7fae6f7c0aea5141e9ac580ab4ce59fede8cc6347af6dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.737270 4732 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(f26bf81718118b17d7fae6f7c0aea5141e9ac580ab4ce59fede8cc6347af6dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.737332 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace(76a7c2bc-c1e1-43e2-8398-b1d908b2d00e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace(76a7c2bc-c1e1-43e2-8398-b1d908b2d00e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(f26bf81718118b17d7fae6f7c0aea5141e9ac580ab4ce59fede8cc6347af6dc4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.879346 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: I1010 07:02:49.879852 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.911728 4732 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(8787e5b10d8b01e7e74638fb511ac9a67d1d5b6fc86a96e05b553f222fbfe23f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.911824 4732 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(8787e5b10d8b01e7e74638fb511ac9a67d1d5b6fc86a96e05b553f222fbfe23f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.911900 4732 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(8787e5b10d8b01e7e74638fb511ac9a67d1d5b6fc86a96e05b553f222fbfe23f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:02:49 crc kubenswrapper[4732]: E1010 07:02:49.911996 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace(76a7c2bc-c1e1-43e2-8398-b1d908b2d00e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace(76a7c2bc-c1e1-43e2-8398-b1d908b2d00e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(8787e5b10d8b01e7e74638fb511ac9a67d1d5b6fc86a96e05b553f222fbfe23f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" Oct 10 07:02:52 crc kubenswrapper[4732]: I1010 07:02:52.660578 4732 scope.go:117] "RemoveContainer" containerID="a37376247646e5fbbfa00dc811de67eb8ada41b85d70795bc6c90d092c37d808" Oct 10 07:02:52 crc kubenswrapper[4732]: E1010 07:02:52.661117 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pnlkp_openshift-multus(d94cc3c3-3cb6-4a5b-996b-90099415f9bf)\"" pod="openshift-multus/multus-pnlkp" podUID="d94cc3c3-3cb6-4a5b-996b-90099415f9bf" Oct 10 07:02:55 crc kubenswrapper[4732]: I1010 07:02:55.356800 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 10 07:02:55 crc kubenswrapper[4732]: I1010 07:02:55.357288 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:03:00 crc kubenswrapper[4732]: I1010 07:03:00.659986 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:00 crc kubenswrapper[4732]: I1010 07:03:00.660743 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:00 crc kubenswrapper[4732]: E1010 07:03:00.695615 4732 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(9f68f2ab9f9b8d0a338bbd4a7f71310b315ddc25e867b5c8f92ab99b82edbb89): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 10 07:03:00 crc kubenswrapper[4732]: E1010 07:03:00.695685 4732 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(9f68f2ab9f9b8d0a338bbd4a7f71310b315ddc25e867b5c8f92ab99b82edbb89): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:00 crc kubenswrapper[4732]: E1010 07:03:00.695724 4732 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(9f68f2ab9f9b8d0a338bbd4a7f71310b315ddc25e867b5c8f92ab99b82edbb89): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:00 crc kubenswrapper[4732]: E1010 07:03:00.695779 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace(76a7c2bc-c1e1-43e2-8398-b1d908b2d00e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace(76a7c2bc-c1e1-43e2-8398-b1d908b2d00e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_openshift-marketplace_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e_0(9f68f2ab9f9b8d0a338bbd4a7f71310b315ddc25e867b5c8f92ab99b82edbb89): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" Oct 10 07:03:07 crc kubenswrapper[4732]: I1010 07:03:07.660912 4732 scope.go:117] "RemoveContainer" containerID="a37376247646e5fbbfa00dc811de67eb8ada41b85d70795bc6c90d092c37d808" Oct 10 07:03:07 crc kubenswrapper[4732]: I1010 07:03:07.979230 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/2.log" Oct 10 07:03:07 crc kubenswrapper[4732]: I1010 07:03:07.980127 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/1.log" Oct 10 07:03:07 crc kubenswrapper[4732]: I1010 07:03:07.980197 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnlkp" event={"ID":"d94cc3c3-3cb6-4a5b-996b-90099415f9bf","Type":"ContainerStarted","Data":"faedb0ca6bb5e5162ec1f790a862c590637ceb5e7f6f6e971448ca8c36905fd2"} Oct 10 07:03:10 crc kubenswrapper[4732]: I1010 07:03:10.148053 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hbj7g" Oct 10 07:03:11 crc kubenswrapper[4732]: I1010 07:03:11.659533 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:11 crc kubenswrapper[4732]: I1010 07:03:11.660144 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:11 crc kubenswrapper[4732]: I1010 07:03:11.845010 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb"] Oct 10 07:03:12 crc kubenswrapper[4732]: I1010 07:03:12.000039 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" event={"ID":"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e","Type":"ContainerStarted","Data":"efa4ee6fb941a5354d370775a310b068c1077e09c00a9c902c171eece1d858c1"} Oct 10 07:03:12 crc kubenswrapper[4732]: I1010 07:03:12.000088 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" event={"ID":"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e","Type":"ContainerStarted","Data":"491e0e9ea7b95fe80f298f9bbd5e66d7578eed6ae42adc78afae111c53f5f051"} Oct 10 07:03:13 crc kubenswrapper[4732]: I1010 07:03:13.007335 4732 generic.go:334] "Generic (PLEG): container finished" podID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerID="efa4ee6fb941a5354d370775a310b068c1077e09c00a9c902c171eece1d858c1" exitCode=0 Oct 10 07:03:13 crc kubenswrapper[4732]: I1010 07:03:13.007437 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" event={"ID":"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e","Type":"ContainerDied","Data":"efa4ee6fb941a5354d370775a310b068c1077e09c00a9c902c171eece1d858c1"} Oct 10 07:03:14 crc kubenswrapper[4732]: I1010 07:03:14.000233 4732 scope.go:117] "RemoveContainer" containerID="c1d0b712d22e22b7b5332f059c85b9e9cd6e1b1476d6baad732d5f6321066209" Oct 10 07:03:15 crc kubenswrapper[4732]: I1010 07:03:15.023264 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" event={"ID":"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e","Type":"ContainerStarted","Data":"01805e06d12e8c910cf5b59bc1b146826c94668da01319a5bc5eda0870b73964"} Oct 10 07:03:15 crc kubenswrapper[4732]: I1010 07:03:15.025136 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnlkp_d94cc3c3-3cb6-4a5b-996b-90099415f9bf/kube-multus/2.log" Oct 10 07:03:16 crc kubenswrapper[4732]: I1010 07:03:16.035053 4732 generic.go:334] "Generic (PLEG): container finished" podID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerID="01805e06d12e8c910cf5b59bc1b146826c94668da01319a5bc5eda0870b73964" exitCode=0 Oct 10 07:03:16 crc kubenswrapper[4732]: I1010 07:03:16.035110 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" event={"ID":"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e","Type":"ContainerDied","Data":"01805e06d12e8c910cf5b59bc1b146826c94668da01319a5bc5eda0870b73964"} Oct 10 07:03:17 crc kubenswrapper[4732]: I1010 07:03:17.044448 4732 generic.go:334] "Generic (PLEG): container finished" podID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerID="60097ab834763416304709e7eaedfb31be9b174e844e49a723ddf4eb67aa00c1" exitCode=0 Oct 10 07:03:17 crc kubenswrapper[4732]: I1010 07:03:17.044544 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" event={"ID":"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e","Type":"ContainerDied","Data":"60097ab834763416304709e7eaedfb31be9b174e844e49a723ddf4eb67aa00c1"} Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.293512 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.353727 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-util\") pod \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.353808 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-bundle\") pod \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.353858 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ccpz\" (UniqueName: \"kubernetes.io/projected/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-kube-api-access-2ccpz\") pod \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\" (UID: \"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e\") " Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.354455 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-bundle" (OuterVolumeSpecName: "bundle") pod "76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" (UID: "76a7c2bc-c1e1-43e2-8398-b1d908b2d00e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.359139 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-kube-api-access-2ccpz" (OuterVolumeSpecName: "kube-api-access-2ccpz") pod "76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" (UID: "76a7c2bc-c1e1-43e2-8398-b1d908b2d00e"). InnerVolumeSpecName "kube-api-access-2ccpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.364952 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-util" (OuterVolumeSpecName: "util") pod "76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" (UID: "76a7c2bc-c1e1-43e2-8398-b1d908b2d00e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.455984 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ccpz\" (UniqueName: \"kubernetes.io/projected/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-kube-api-access-2ccpz\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.456064 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-util\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:18 crc kubenswrapper[4732]: I1010 07:03:18.456087 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76a7c2bc-c1e1-43e2-8398-b1d908b2d00e-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:19 crc kubenswrapper[4732]: I1010 07:03:19.058727 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" event={"ID":"76a7c2bc-c1e1-43e2-8398-b1d908b2d00e","Type":"ContainerDied","Data":"491e0e9ea7b95fe80f298f9bbd5e66d7578eed6ae42adc78afae111c53f5f051"} Oct 10 07:03:19 crc kubenswrapper[4732]: I1010 07:03:19.058782 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="491e0e9ea7b95fe80f298f9bbd5e66d7578eed6ae42adc78afae111c53f5f051" Oct 10 07:03:19 crc kubenswrapper[4732]: I1010 07:03:19.058800 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.115810 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd"] Oct 10 07:03:21 crc kubenswrapper[4732]: E1010 07:03:21.116461 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerName="util" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.116476 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerName="util" Oct 10 07:03:21 crc kubenswrapper[4732]: E1010 07:03:21.116486 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerName="extract" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.116495 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerName="extract" Oct 10 07:03:21 crc kubenswrapper[4732]: E1010 07:03:21.116510 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerName="pull" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.116518 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerName="pull" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.116634 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a7c2bc-c1e1-43e2-8398-b1d908b2d00e" containerName="extract" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.117109 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.119146 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ls2p8" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.120646 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.126397 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.144361 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd"] Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.187844 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wrz8\" (UniqueName: \"kubernetes.io/projected/e1c7f94b-267b-420a-bd99-7d34c8b02a22-kube-api-access-5wrz8\") pod \"nmstate-operator-858ddd8f98-sz7pd\" (UID: \"e1c7f94b-267b-420a-bd99-7d34c8b02a22\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.288907 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wrz8\" (UniqueName: \"kubernetes.io/projected/e1c7f94b-267b-420a-bd99-7d34c8b02a22-kube-api-access-5wrz8\") pod \"nmstate-operator-858ddd8f98-sz7pd\" (UID: \"e1c7f94b-267b-420a-bd99-7d34c8b02a22\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.305991 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wrz8\" (UniqueName: \"kubernetes.io/projected/e1c7f94b-267b-420a-bd99-7d34c8b02a22-kube-api-access-5wrz8\") pod \"nmstate-operator-858ddd8f98-sz7pd\" (UID: 
\"e1c7f94b-267b-420a-bd99-7d34c8b02a22\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.434200 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" Oct 10 07:03:21 crc kubenswrapper[4732]: I1010 07:03:21.713067 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd"] Oct 10 07:03:22 crc kubenswrapper[4732]: I1010 07:03:22.073180 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" event={"ID":"e1c7f94b-267b-420a-bd99-7d34c8b02a22","Type":"ContainerStarted","Data":"b6f228726540dcfa69aea3eec99572f82fdb6670402909d4ef8a0d1c53d94e66"} Oct 10 07:03:25 crc kubenswrapper[4732]: I1010 07:03:25.356006 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:03:25 crc kubenswrapper[4732]: I1010 07:03:25.356623 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:03:28 crc kubenswrapper[4732]: I1010 07:03:28.107597 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" event={"ID":"e1c7f94b-267b-420a-bd99-7d34c8b02a22","Type":"ContainerStarted","Data":"4c49978bb2df625adf0bc16066277396af2092dd837943926358682f5d1c7cdc"} Oct 10 07:03:28 crc kubenswrapper[4732]: I1010 07:03:28.128613 4732 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-sz7pd" podStartSLOduration=1.102568364 podStartE2EDuration="7.128597371s" podCreationTimestamp="2025-10-10 07:03:21 +0000 UTC" firstStartedPulling="2025-10-10 07:03:21.728944298 +0000 UTC m=+728.798535539" lastFinishedPulling="2025-10-10 07:03:27.754973305 +0000 UTC m=+734.824564546" observedRunningTime="2025-10-10 07:03:28.126970388 +0000 UTC m=+735.196561649" watchObservedRunningTime="2025-10-10 07:03:28.128597371 +0000 UTC m=+735.198188612" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.842693 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h"] Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.844146 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.847211 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lfczg" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.855997 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h"] Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.867445 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv"] Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.868271 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.870819 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.879171 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hnbqn"] Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.880017 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.888124 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv"] Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.902757 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nm5l\" (UniqueName: \"kubernetes.io/projected/7a16ea47-04cf-4b90-8380-35ad716c299c-kube-api-access-7nm5l\") pod \"nmstate-metrics-fdff9cb8d-kd86h\" (UID: \"7a16ea47-04cf-4b90-8380-35ad716c299c\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.902823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd7vn\" (UniqueName: \"kubernetes.io/projected/4434a14e-fb27-4cfa-adef-4d4f02e5a775-kube-api-access-dd7vn\") pod \"nmstate-webhook-6cdbc54649-7f9qv\" (UID: \"4434a14e-fb27-4cfa-adef-4d4f02e5a775\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.902858 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4434a14e-fb27-4cfa-adef-4d4f02e5a775-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7f9qv\" (UID: 
\"4434a14e-fb27-4cfa-adef-4d4f02e5a775\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.993252 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95"] Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.993909 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.995780 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.996008 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 10 07:03:29 crc kubenswrapper[4732]: I1010 07:03:29.996149 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6bs7f" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.009019 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95"] Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.025066 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-dbus-socket\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.025119 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-ovs-socket\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc 
kubenswrapper[4732]: I1010 07:03:30.025186 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nm5l\" (UniqueName: \"kubernetes.io/projected/7a16ea47-04cf-4b90-8380-35ad716c299c-kube-api-access-7nm5l\") pod \"nmstate-metrics-fdff9cb8d-kd86h\" (UID: \"7a16ea47-04cf-4b90-8380-35ad716c299c\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.025210 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd7vn\" (UniqueName: \"kubernetes.io/projected/4434a14e-fb27-4cfa-adef-4d4f02e5a775-kube-api-access-dd7vn\") pod \"nmstate-webhook-6cdbc54649-7f9qv\" (UID: \"4434a14e-fb27-4cfa-adef-4d4f02e5a775\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.025227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t2h8\" (UniqueName: \"kubernetes.io/projected/060ec441-2901-4e80-bd22-0b9ece859320-kube-api-access-5t2h8\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.025252 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-nmstate-lock\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.025271 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4434a14e-fb27-4cfa-adef-4d4f02e5a775-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7f9qv\" (UID: \"4434a14e-fb27-4cfa-adef-4d4f02e5a775\") " 
pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:30 crc kubenswrapper[4732]: E1010 07:03:30.025440 4732 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 10 07:03:30 crc kubenswrapper[4732]: E1010 07:03:30.025522 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4434a14e-fb27-4cfa-adef-4d4f02e5a775-tls-key-pair podName:4434a14e-fb27-4cfa-adef-4d4f02e5a775 nodeName:}" failed. No retries permitted until 2025-10-10 07:03:30.525502466 +0000 UTC m=+737.595093707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/4434a14e-fb27-4cfa-adef-4d4f02e5a775-tls-key-pair") pod "nmstate-webhook-6cdbc54649-7f9qv" (UID: "4434a14e-fb27-4cfa-adef-4d4f02e5a775") : secret "openshift-nmstate-webhook" not found Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.048336 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd7vn\" (UniqueName: \"kubernetes.io/projected/4434a14e-fb27-4cfa-adef-4d4f02e5a775-kube-api-access-dd7vn\") pod \"nmstate-webhook-6cdbc54649-7f9qv\" (UID: \"4434a14e-fb27-4cfa-adef-4d4f02e5a775\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.050321 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nm5l\" (UniqueName: \"kubernetes.io/projected/7a16ea47-04cf-4b90-8380-35ad716c299c-kube-api-access-7nm5l\") pod \"nmstate-metrics-fdff9cb8d-kd86h\" (UID: \"7a16ea47-04cf-4b90-8380-35ad716c299c\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.130595 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc0013a6-b60e-46d4-b9dd-bd692cc42068-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.130780 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t2h8\" (UniqueName: \"kubernetes.io/projected/060ec441-2901-4e80-bd22-0b9ece859320-kube-api-access-5t2h8\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.130812 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk46w\" (UniqueName: \"kubernetes.io/projected/bc0013a6-b60e-46d4-b9dd-bd692cc42068-kube-api-access-tk46w\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.130840 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-nmstate-lock\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.130926 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-nmstate-lock\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.130939 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-dbus-socket\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.131024 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bc0013a6-b60e-46d4-b9dd-bd692cc42068-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.131060 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-ovs-socket\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.131257 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-ovs-socket\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.131524 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/060ec441-2901-4e80-bd22-0b9ece859320-dbus-socket\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.149464 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t2h8\" (UniqueName: 
\"kubernetes.io/projected/060ec441-2901-4e80-bd22-0b9ece859320-kube-api-access-5t2h8\") pod \"nmstate-handler-hnbqn\" (UID: \"060ec441-2901-4e80-bd22-0b9ece859320\") " pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.160967 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.206099 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.218533 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f555b5bf4-znpfr"] Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.219822 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.232816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0013a6-b60e-46d4-b9dd-bd692cc42068-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.232877 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk46w\" (UniqueName: \"kubernetes.io/projected/bc0013a6-b60e-46d4-b9dd-bd692cc42068-kube-api-access-tk46w\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.232937 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/bc0013a6-b60e-46d4-b9dd-bd692cc42068-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.233951 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bc0013a6-b60e-46d4-b9dd-bd692cc42068-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: E1010 07:03:30.234050 4732 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Oct 10 07:03:30 crc kubenswrapper[4732]: E1010 07:03:30.234122 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc0013a6-b60e-46d4-b9dd-bd692cc42068-plugin-serving-cert podName:bc0013a6-b60e-46d4-b9dd-bd692cc42068 nodeName:}" failed. No retries permitted until 2025-10-10 07:03:30.734104296 +0000 UTC m=+737.803695537 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/bc0013a6-b60e-46d4-b9dd-bd692cc42068-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-lcv95" (UID: "bc0013a6-b60e-46d4-b9dd-bd692cc42068") : secret "plugin-serving-cert" not found Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.238781 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f555b5bf4-znpfr"] Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.258391 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk46w\" (UniqueName: \"kubernetes.io/projected/bc0013a6-b60e-46d4-b9dd-bd692cc42068-kube-api-access-tk46w\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.334128 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-oauth-config\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.334682 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-serving-cert\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.334753 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-config\") pod 
\"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.334786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-service-ca\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.334817 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkjk\" (UniqueName: \"kubernetes.io/projected/03ec76f2-dd4d-460a-88aa-acd94ab0b898-kube-api-access-pxkjk\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.334840 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-oauth-serving-cert\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.334863 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-trusted-ca-bundle\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.427090 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h"] Oct 10 07:03:30 crc kubenswrapper[4732]: 
W1010 07:03:30.434607 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a16ea47_04cf_4b90_8380_35ad716c299c.slice/crio-4e3befefe80dc6c4b5d9e6bf08e2cbaac05fdf83b0fc989f546ec1ed9a0732b7 WatchSource:0}: Error finding container 4e3befefe80dc6c4b5d9e6bf08e2cbaac05fdf83b0fc989f546ec1ed9a0732b7: Status 404 returned error can't find the container with id 4e3befefe80dc6c4b5d9e6bf08e2cbaac05fdf83b0fc989f546ec1ed9a0732b7 Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.435569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-service-ca\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.435679 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkjk\" (UniqueName: \"kubernetes.io/projected/03ec76f2-dd4d-460a-88aa-acd94ab0b898-kube-api-access-pxkjk\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.435783 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-oauth-serving-cert\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.435857 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-trusted-ca-bundle\") pod \"console-6f555b5bf4-znpfr\" (UID: 
\"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.435970 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-oauth-config\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.436250 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-serving-cert\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.436341 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-config\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.436566 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-service-ca\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.437178 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-config\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " 
pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.437274 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-oauth-serving-cert\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.437782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ec76f2-dd4d-460a-88aa-acd94ab0b898-trusted-ca-bundle\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.440670 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-oauth-config\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.442034 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ec76f2-dd4d-460a-88aa-acd94ab0b898-console-serving-cert\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.453299 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkjk\" (UniqueName: \"kubernetes.io/projected/03ec76f2-dd4d-460a-88aa-acd94ab0b898-kube-api-access-pxkjk\") pod \"console-6f555b5bf4-znpfr\" (UID: \"03ec76f2-dd4d-460a-88aa-acd94ab0b898\") " pod="openshift-console/console-6f555b5bf4-znpfr" 
Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.537392 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4434a14e-fb27-4cfa-adef-4d4f02e5a775-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7f9qv\" (UID: \"4434a14e-fb27-4cfa-adef-4d4f02e5a775\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.541147 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/4434a14e-fb27-4cfa-adef-4d4f02e5a775-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-7f9qv\" (UID: \"4434a14e-fb27-4cfa-adef-4d4f02e5a775\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.564157 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.739644 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0013a6-b60e-46d4-b9dd-bd692cc42068-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.743566 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc0013a6-b60e-46d4-b9dd-bd692cc42068-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-lcv95\" (UID: \"bc0013a6-b60e-46d4-b9dd-bd692cc42068\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.790097 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.936962 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.970780 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f555b5bf4-znpfr"] Oct 10 07:03:30 crc kubenswrapper[4732]: I1010 07:03:30.985372 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv"] Oct 10 07:03:31 crc kubenswrapper[4732]: I1010 07:03:31.139671 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hnbqn" event={"ID":"060ec441-2901-4e80-bd22-0b9ece859320","Type":"ContainerStarted","Data":"1073d124d94bd83392cedfe1bb09976af539822b2795346229a30ec4fb475cc0"} Oct 10 07:03:31 crc kubenswrapper[4732]: I1010 07:03:31.141219 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f555b5bf4-znpfr" event={"ID":"03ec76f2-dd4d-460a-88aa-acd94ab0b898","Type":"ContainerStarted","Data":"921d6667b0625a75854e3835c51f48c86af92eb4200eb1e65f24cf1109d6181d"} Oct 10 07:03:31 crc kubenswrapper[4732]: I1010 07:03:31.142841 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" event={"ID":"7a16ea47-04cf-4b90-8380-35ad716c299c","Type":"ContainerStarted","Data":"4e3befefe80dc6c4b5d9e6bf08e2cbaac05fdf83b0fc989f546ec1ed9a0732b7"} Oct 10 07:03:31 crc kubenswrapper[4732]: I1010 07:03:31.147645 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" event={"ID":"4434a14e-fb27-4cfa-adef-4d4f02e5a775","Type":"ContainerStarted","Data":"c96653d0d5ec73ba53dbc4e259034ed3ce18eaa2458da8a9ca632f78a81e83d3"} Oct 10 07:03:31 crc kubenswrapper[4732]: I1010 07:03:31.161194 4732 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95"] Oct 10 07:03:31 crc kubenswrapper[4732]: W1010 07:03:31.169183 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc0013a6_b60e_46d4_b9dd_bd692cc42068.slice/crio-d42bf7f76d16e8985898f0c399a4b7865893ee4e0074b8b0738baeb0dc1bc112 WatchSource:0}: Error finding container d42bf7f76d16e8985898f0c399a4b7865893ee4e0074b8b0738baeb0dc1bc112: Status 404 returned error can't find the container with id d42bf7f76d16e8985898f0c399a4b7865893ee4e0074b8b0738baeb0dc1bc112 Oct 10 07:03:32 crc kubenswrapper[4732]: I1010 07:03:32.154294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" event={"ID":"bc0013a6-b60e-46d4-b9dd-bd692cc42068","Type":"ContainerStarted","Data":"d42bf7f76d16e8985898f0c399a4b7865893ee4e0074b8b0738baeb0dc1bc112"} Oct 10 07:03:32 crc kubenswrapper[4732]: I1010 07:03:32.156256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f555b5bf4-znpfr" event={"ID":"03ec76f2-dd4d-460a-88aa-acd94ab0b898","Type":"ContainerStarted","Data":"e96243614beedd754d9e1a6dc6d2bbdccb4d657563bb125a7ad94bbcf9c5eb58"} Oct 10 07:03:32 crc kubenswrapper[4732]: I1010 07:03:32.183819 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f555b5bf4-znpfr" podStartSLOduration=2.183797157 podStartE2EDuration="2.183797157s" podCreationTimestamp="2025-10-10 07:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:03:32.178961089 +0000 UTC m=+739.248552330" watchObservedRunningTime="2025-10-10 07:03:32.183797157 +0000 UTC m=+739.253388408" Oct 10 07:03:33 crc kubenswrapper[4732]: I1010 07:03:33.166348 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" event={"ID":"7a16ea47-04cf-4b90-8380-35ad716c299c","Type":"ContainerStarted","Data":"a731bb7a739157b91c486b38eaf0fded72f0a78b178ecd9c09b4a5f1e837b012"} Oct 10 07:03:33 crc kubenswrapper[4732]: I1010 07:03:33.167665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" event={"ID":"4434a14e-fb27-4cfa-adef-4d4f02e5a775","Type":"ContainerStarted","Data":"db26f27bf67bced0564c319d9e8440f7651a0ef31f10be9ff50fb6a6bb22d42d"} Oct 10 07:03:33 crc kubenswrapper[4732]: I1010 07:03:33.169595 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hnbqn" event={"ID":"060ec441-2901-4e80-bd22-0b9ece859320","Type":"ContainerStarted","Data":"07a6300099b22740be5c300d9d085bcc05610b70d9eb00d2bf6b059998dc909e"} Oct 10 07:03:33 crc kubenswrapper[4732]: I1010 07:03:33.169831 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:33 crc kubenswrapper[4732]: I1010 07:03:33.199196 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" podStartSLOduration=2.461663198 podStartE2EDuration="4.199174785s" podCreationTimestamp="2025-10-10 07:03:29 +0000 UTC" firstStartedPulling="2025-10-10 07:03:31.003917276 +0000 UTC m=+738.073508517" lastFinishedPulling="2025-10-10 07:03:32.741428863 +0000 UTC m=+739.811020104" observedRunningTime="2025-10-10 07:03:33.197808729 +0000 UTC m=+740.267400020" watchObservedRunningTime="2025-10-10 07:03:33.199174785 +0000 UTC m=+740.268766026" Oct 10 07:03:33 crc kubenswrapper[4732]: I1010 07:03:33.235174 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hnbqn" podStartSLOduration=1.8472133899999998 podStartE2EDuration="4.235121797s" podCreationTimestamp="2025-10-10 07:03:29 +0000 UTC" 
firstStartedPulling="2025-10-10 07:03:30.267315525 +0000 UTC m=+737.336906756" lastFinishedPulling="2025-10-10 07:03:32.655223922 +0000 UTC m=+739.724815163" observedRunningTime="2025-10-10 07:03:33.225713668 +0000 UTC m=+740.295304919" watchObservedRunningTime="2025-10-10 07:03:33.235121797 +0000 UTC m=+740.304713038" Oct 10 07:03:34 crc kubenswrapper[4732]: I1010 07:03:34.180667 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:35 crc kubenswrapper[4732]: I1010 07:03:35.188872 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" event={"ID":"bc0013a6-b60e-46d4-b9dd-bd692cc42068","Type":"ContainerStarted","Data":"016513ad7c48679368841e14f3fffa56ad8dad01e473dd597b21f1c1a0b5005e"} Oct 10 07:03:35 crc kubenswrapper[4732]: I1010 07:03:35.209139 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-lcv95" podStartSLOduration=3.235814774 podStartE2EDuration="6.209114931s" podCreationTimestamp="2025-10-10 07:03:29 +0000 UTC" firstStartedPulling="2025-10-10 07:03:31.173012021 +0000 UTC m=+738.242603262" lastFinishedPulling="2025-10-10 07:03:34.146312158 +0000 UTC m=+741.215903419" observedRunningTime="2025-10-10 07:03:35.205736241 +0000 UTC m=+742.275327502" watchObservedRunningTime="2025-10-10 07:03:35.209114931 +0000 UTC m=+742.278706172" Oct 10 07:03:36 crc kubenswrapper[4732]: I1010 07:03:36.196088 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" event={"ID":"7a16ea47-04cf-4b90-8380-35ad716c299c","Type":"ContainerStarted","Data":"4d5a1efc40e7d833c54ddd1d3d0ec960fb5149380234179e853b051a9f9dbbbd"} Oct 10 07:03:36 crc kubenswrapper[4732]: I1010 07:03:36.212954 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-kd86h" podStartSLOduration=2.426338694 podStartE2EDuration="7.212931973s" podCreationTimestamp="2025-10-10 07:03:29 +0000 UTC" firstStartedPulling="2025-10-10 07:03:30.438532826 +0000 UTC m=+737.508124067" lastFinishedPulling="2025-10-10 07:03:35.225126115 +0000 UTC m=+742.294717346" observedRunningTime="2025-10-10 07:03:36.212467141 +0000 UTC m=+743.282058392" watchObservedRunningTime="2025-10-10 07:03:36.212931973 +0000 UTC m=+743.282523214" Oct 10 07:03:40 crc kubenswrapper[4732]: I1010 07:03:40.227427 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hnbqn" Oct 10 07:03:40 crc kubenswrapper[4732]: I1010 07:03:40.571184 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:40 crc kubenswrapper[4732]: I1010 07:03:40.571470 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:40 crc kubenswrapper[4732]: I1010 07:03:40.575890 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:41 crc kubenswrapper[4732]: I1010 07:03:41.242615 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f555b5bf4-znpfr" Oct 10 07:03:41 crc kubenswrapper[4732]: I1010 07:03:41.302369 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kg7gq"] Oct 10 07:03:50 crc kubenswrapper[4732]: I1010 07:03:50.795891 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-7f9qv" Oct 10 07:03:52 crc kubenswrapper[4732]: I1010 07:03:52.185055 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6srp"] Oct 10 07:03:52 crc 
kubenswrapper[4732]: I1010 07:03:52.185722 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" podUID="e8b58414-93da-4fc9-904b-1886401e00c8" containerName="controller-manager" containerID="cri-o://06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391" gracePeriod=30 Oct 10 07:03:52 crc kubenswrapper[4732]: I1010 07:03:52.292042 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p"] Oct 10 07:03:52 crc kubenswrapper[4732]: I1010 07:03:52.292292 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" podUID="76679b84-27e7-4a6a-b904-f399c9b7eb8d" containerName="route-controller-manager" containerID="cri-o://adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2" gracePeriod=30 Oct 10 07:03:52 crc kubenswrapper[4732]: E1010 07:03:52.341619 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8b58414_93da_4fc9_904b_1886401e00c8.slice/crio-06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.030501 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.129768 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.150484 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4thm\" (UniqueName: \"kubernetes.io/projected/e8b58414-93da-4fc9-904b-1886401e00c8-kube-api-access-z4thm\") pod \"e8b58414-93da-4fc9-904b-1886401e00c8\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.150532 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-config\") pod \"e8b58414-93da-4fc9-904b-1886401e00c8\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.150552 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-proxy-ca-bundles\") pod \"e8b58414-93da-4fc9-904b-1886401e00c8\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.150612 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-client-ca\") pod \"e8b58414-93da-4fc9-904b-1886401e00c8\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.150724 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b58414-93da-4fc9-904b-1886401e00c8-serving-cert\") pod \"e8b58414-93da-4fc9-904b-1886401e00c8\" (UID: \"e8b58414-93da-4fc9-904b-1886401e00c8\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.151994 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-config" (OuterVolumeSpecName: "config") pod "e8b58414-93da-4fc9-904b-1886401e00c8" (UID: "e8b58414-93da-4fc9-904b-1886401e00c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.152429 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e8b58414-93da-4fc9-904b-1886401e00c8" (UID: "e8b58414-93da-4fc9-904b-1886401e00c8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.152723 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "e8b58414-93da-4fc9-904b-1886401e00c8" (UID: "e8b58414-93da-4fc9-904b-1886401e00c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.156829 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b58414-93da-4fc9-904b-1886401e00c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e8b58414-93da-4fc9-904b-1886401e00c8" (UID: "e8b58414-93da-4fc9-904b-1886401e00c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.157113 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b58414-93da-4fc9-904b-1886401e00c8-kube-api-access-z4thm" (OuterVolumeSpecName: "kube-api-access-z4thm") pod "e8b58414-93da-4fc9-904b-1886401e00c8" (UID: "e8b58414-93da-4fc9-904b-1886401e00c8"). InnerVolumeSpecName "kube-api-access-z4thm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252369 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-config\") pod \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252469 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76679b84-27e7-4a6a-b904-f399c9b7eb8d-serving-cert\") pod \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-client-ca\") pod \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252603 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9xr4\" (UniqueName: \"kubernetes.io/projected/76679b84-27e7-4a6a-b904-f399c9b7eb8d-kube-api-access-h9xr4\") pod \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\" (UID: \"76679b84-27e7-4a6a-b904-f399c9b7eb8d\") " Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252849 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8b58414-93da-4fc9-904b-1886401e00c8-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252864 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4thm\" (UniqueName: \"kubernetes.io/projected/e8b58414-93da-4fc9-904b-1886401e00c8-kube-api-access-z4thm\") on node \"crc\" DevicePath \"\"" Oct 
10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252876 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252886 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.252896 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e8b58414-93da-4fc9-904b-1886401e00c8-client-ca\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.253353 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "76679b84-27e7-4a6a-b904-f399c9b7eb8d" (UID: "76679b84-27e7-4a6a-b904-f399c9b7eb8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.253977 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-config" (OuterVolumeSpecName: "config") pod "76679b84-27e7-4a6a-b904-f399c9b7eb8d" (UID: "76679b84-27e7-4a6a-b904-f399c9b7eb8d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.256526 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76679b84-27e7-4a6a-b904-f399c9b7eb8d-kube-api-access-h9xr4" (OuterVolumeSpecName: "kube-api-access-h9xr4") pod "76679b84-27e7-4a6a-b904-f399c9b7eb8d" (UID: "76679b84-27e7-4a6a-b904-f399c9b7eb8d"). 
InnerVolumeSpecName "kube-api-access-h9xr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.256543 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76679b84-27e7-4a6a-b904-f399c9b7eb8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76679b84-27e7-4a6a-b904-f399c9b7eb8d" (UID: "76679b84-27e7-4a6a-b904-f399c9b7eb8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.302015 4732 generic.go:334] "Generic (PLEG): container finished" podID="76679b84-27e7-4a6a-b904-f399c9b7eb8d" containerID="adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2" exitCode=0 Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.302088 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.302113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" event={"ID":"76679b84-27e7-4a6a-b904-f399c9b7eb8d","Type":"ContainerDied","Data":"adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2"} Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.302157 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p" event={"ID":"76679b84-27e7-4a6a-b904-f399c9b7eb8d","Type":"ContainerDied","Data":"583b9d31e50ae9fbcc8a1d3b53bf53664a32c7f06a2c7058c4114e5f3ac48563"} Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.302178 4732 scope.go:117] "RemoveContainer" containerID="adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.305631 4732 generic.go:334] "Generic (PLEG): 
container finished" podID="e8b58414-93da-4fc9-904b-1886401e00c8" containerID="06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391" exitCode=0 Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.305670 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" event={"ID":"e8b58414-93da-4fc9-904b-1886401e00c8","Type":"ContainerDied","Data":"06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391"} Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.305719 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.305727 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z6srp" event={"ID":"e8b58414-93da-4fc9-904b-1886401e00c8","Type":"ContainerDied","Data":"d68157c85424df9af606f9d206aecc46b259202642c4ac1adb5def9fb98fa6bd"} Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.364849 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-client-ca\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.364888 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9xr4\" (UniqueName: \"kubernetes.io/projected/76679b84-27e7-4a6a-b904-f399c9b7eb8d-kube-api-access-h9xr4\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.364904 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76679b84-27e7-4a6a-b904-f399c9b7eb8d-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.364914 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/76679b84-27e7-4a6a-b904-f399c9b7eb8d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.375602 4732 scope.go:117] "RemoveContainer" containerID="adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2" Oct 10 07:03:53 crc kubenswrapper[4732]: E1010 07:03:53.376666 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2\": container with ID starting with adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2 not found: ID does not exist" containerID="adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.376722 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2"} err="failed to get container status \"adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2\": rpc error: code = NotFound desc = could not find container \"adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2\": container with ID starting with adacd40af7aa5a18fea53e61cabe1444dc1fb5cb19fa88ad92588151b7f94ee2 not found: ID does not exist" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.376749 4732 scope.go:117] "RemoveContainer" containerID="06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.389489 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p"] Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.393531 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-l858p"] Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.402173 4732 
scope.go:117] "RemoveContainer" containerID="06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391" Oct 10 07:03:53 crc kubenswrapper[4732]: E1010 07:03:53.404834 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391\": container with ID starting with 06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391 not found: ID does not exist" containerID="06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.404914 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391"} err="failed to get container status \"06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391\": rpc error: code = NotFound desc = could not find container \"06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391\": container with ID starting with 06837b0829b50f85231a5e3a1486c9278599270ba28eba4744e2fb953db05391 not found: ID does not exist" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.406364 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6srp"] Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.411463 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z6srp"] Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.667947 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76679b84-27e7-4a6a-b904-f399c9b7eb8d" path="/var/lib/kubelet/pods/76679b84-27e7-4a6a-b904-f399c9b7eb8d/volumes" Oct 10 07:03:53 crc kubenswrapper[4732]: I1010 07:03:53.668872 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b58414-93da-4fc9-904b-1886401e00c8" 
path="/var/lib/kubelet/pods/e8b58414-93da-4fc9-904b-1886401e00c8/volumes" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.045061 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx"] Oct 10 07:03:54 crc kubenswrapper[4732]: E1010 07:03:54.045348 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76679b84-27e7-4a6a-b904-f399c9b7eb8d" containerName="route-controller-manager" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.045380 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76679b84-27e7-4a6a-b904-f399c9b7eb8d" containerName="route-controller-manager" Oct 10 07:03:54 crc kubenswrapper[4732]: E1010 07:03:54.045405 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b58414-93da-4fc9-904b-1886401e00c8" containerName="controller-manager" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.045414 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b58414-93da-4fc9-904b-1886401e00c8" containerName="controller-manager" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.045502 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b58414-93da-4fc9-904b-1886401e00c8" containerName="controller-manager" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.045521 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="76679b84-27e7-4a6a-b904-f399c9b7eb8d" containerName="route-controller-manager" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.046760 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.047429 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f"] Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.048054 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.051061 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.051550 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.052061 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.052801 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.052924 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.053020 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.053123 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.053260 4732 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.053560 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.053980 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.054206 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.057218 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.058202 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.058432 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f"] Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.061847 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx"] Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176242 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zwn9\" (UniqueName: \"kubernetes.io/projected/928bfb81-a1ed-4bbf-81c4-ee8b84634958-kube-api-access-4zwn9\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176309 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-config\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176340 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928bfb81-a1ed-4bbf-81c4-ee8b84634958-config\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176365 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/928bfb81-a1ed-4bbf-81c4-ee8b84634958-client-ca\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176395 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-client-ca\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176418 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b79m2\" (UniqueName: \"kubernetes.io/projected/d10414d0-05f2-440f-a6b8-f96029df0e7b-kube-api-access-b79m2\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " 
pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176543 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/928bfb81-a1ed-4bbf-81c4-ee8b84634958-serving-cert\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176621 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10414d0-05f2-440f-a6b8-f96029df0e7b-serving-cert\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.176759 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-proxy-ca-bundles\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278069 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-proxy-ca-bundles\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278149 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zwn9\" 
(UniqueName: \"kubernetes.io/projected/928bfb81-a1ed-4bbf-81c4-ee8b84634958-kube-api-access-4zwn9\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278184 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-config\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278234 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928bfb81-a1ed-4bbf-81c4-ee8b84634958-config\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278257 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/928bfb81-a1ed-4bbf-81c4-ee8b84634958-client-ca\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-client-ca\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: 
I1010 07:03:54.278333 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b79m2\" (UniqueName: \"kubernetes.io/projected/d10414d0-05f2-440f-a6b8-f96029df0e7b-kube-api-access-b79m2\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278393 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/928bfb81-a1ed-4bbf-81c4-ee8b84634958-serving-cert\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.278427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10414d0-05f2-440f-a6b8-f96029df0e7b-serving-cert\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.279604 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928bfb81-a1ed-4bbf-81c4-ee8b84634958-config\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.279903 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-config\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " 
pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.279918 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/928bfb81-a1ed-4bbf-81c4-ee8b84634958-client-ca\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.279940 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-client-ca\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.281209 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d10414d0-05f2-440f-a6b8-f96029df0e7b-proxy-ca-bundles\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.282943 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d10414d0-05f2-440f-a6b8-f96029df0e7b-serving-cert\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.282961 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/928bfb81-a1ed-4bbf-81c4-ee8b84634958-serving-cert\") pod 
\"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.298625 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b79m2\" (UniqueName: \"kubernetes.io/projected/d10414d0-05f2-440f-a6b8-f96029df0e7b-kube-api-access-b79m2\") pod \"controller-manager-7ccd8bd974-wbn5f\" (UID: \"d10414d0-05f2-440f-a6b8-f96029df0e7b\") " pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.300194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zwn9\" (UniqueName: \"kubernetes.io/projected/928bfb81-a1ed-4bbf-81c4-ee8b84634958-kube-api-access-4zwn9\") pod \"route-controller-manager-6666f795c8-gb7sx\" (UID: \"928bfb81-a1ed-4bbf-81c4-ee8b84634958\") " pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.380330 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.388602 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.642455 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f"] Oct 10 07:03:54 crc kubenswrapper[4732]: W1010 07:03:54.647259 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd10414d0_05f2_440f_a6b8_f96029df0e7b.slice/crio-882d425f1eafa843d5cb321560243914c7307c32ee1043dd35e4a7a1de51f209 WatchSource:0}: Error finding container 882d425f1eafa843d5cb321560243914c7307c32ee1043dd35e4a7a1de51f209: Status 404 returned error can't find the container with id 882d425f1eafa843d5cb321560243914c7307c32ee1043dd35e4a7a1de51f209 Oct 10 07:03:54 crc kubenswrapper[4732]: I1010 07:03:54.796740 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx"] Oct 10 07:03:54 crc kubenswrapper[4732]: W1010 07:03:54.803950 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928bfb81_a1ed_4bbf_81c4_ee8b84634958.slice/crio-b5a7cacc257130dbfae1085378316156a3a0ff52975b63ae6e63e6bda5bd8453 WatchSource:0}: Error finding container b5a7cacc257130dbfae1085378316156a3a0ff52975b63ae6e63e6bda5bd8453: Status 404 returned error can't find the container with id b5a7cacc257130dbfae1085378316156a3a0ff52975b63ae6e63e6bda5bd8453 Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.322996 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" event={"ID":"928bfb81-a1ed-4bbf-81c4-ee8b84634958","Type":"ContainerStarted","Data":"8c515ea48241b09f5c26f7993c9f462d321d80e5a153e4b8d1baafc320f31b39"} Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.323041 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" event={"ID":"928bfb81-a1ed-4bbf-81c4-ee8b84634958","Type":"ContainerStarted","Data":"b5a7cacc257130dbfae1085378316156a3a0ff52975b63ae6e63e6bda5bd8453"} Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.323297 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.324526 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" event={"ID":"d10414d0-05f2-440f-a6b8-f96029df0e7b","Type":"ContainerStarted","Data":"9509bd3d446575aea75aba39516b32bd04b619168692dafee242944a3a7a1f19"} Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.324561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" event={"ID":"d10414d0-05f2-440f-a6b8-f96029df0e7b","Type":"ContainerStarted","Data":"882d425f1eafa843d5cb321560243914c7307c32ee1043dd35e4a7a1de51f209"} Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.324799 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.332012 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.351166 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" podStartSLOduration=3.351067004 podStartE2EDuration="3.351067004s" podCreationTimestamp="2025-10-10 07:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:03:55.348668751 +0000 UTC m=+762.418260002" watchObservedRunningTime="2025-10-10 07:03:55.351067004 +0000 UTC m=+762.420658255" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.355486 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.355543 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.355586 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.356293 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53a6572dbd94b6a842e7d21f0e7b30a3d933ba27c7ff2433ef57f5a4d47b6c8d"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.356354 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://53a6572dbd94b6a842e7d21f0e7b30a3d933ba27c7ff2433ef57f5a4d47b6c8d" gracePeriod=600 Oct 
10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.370176 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7ccd8bd974-wbn5f" podStartSLOduration=3.370153789 podStartE2EDuration="3.370153789s" podCreationTimestamp="2025-10-10 07:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:03:55.369021159 +0000 UTC m=+762.438612410" watchObservedRunningTime="2025-10-10 07:03:55.370153789 +0000 UTC m=+762.439745030" Oct 10 07:03:55 crc kubenswrapper[4732]: I1010 07:03:55.535925 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6666f795c8-gb7sx" Oct 10 07:03:56 crc kubenswrapper[4732]: I1010 07:03:56.334245 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="53a6572dbd94b6a842e7d21f0e7b30a3d933ba27c7ff2433ef57f5a4d47b6c8d" exitCode=0 Oct 10 07:03:56 crc kubenswrapper[4732]: I1010 07:03:56.334342 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"53a6572dbd94b6a842e7d21f0e7b30a3d933ba27c7ff2433ef57f5a4d47b6c8d"} Oct 10 07:03:56 crc kubenswrapper[4732]: I1010 07:03:56.334607 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"91a09d30e877ca33916c77e5509ae9d7f46220996d3447c91784df288ae8e1b0"} Oct 10 07:03:56 crc kubenswrapper[4732]: I1010 07:03:56.334635 4732 scope.go:117] "RemoveContainer" containerID="0f1dcf57554420a877507004f1258c39575e3a24cbe7c5b8ca3aeabfcdb7b710" Oct 10 07:03:58 crc kubenswrapper[4732]: I1010 07:03:58.936159 4732 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.134127 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt"] Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.135736 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.137441 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.147251 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt"] Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.263750 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.263797 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqxz5\" (UniqueName: \"kubernetes.io/projected/2edbc338-d144-4bd8-a06a-3ea0537ba513-kube-api-access-nqxz5\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.263843 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.348829 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-kg7gq" podUID="e7a62711-6cb6-4867-a232-8b8b043faa74" containerName="console" containerID="cri-o://978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8" gracePeriod=15 Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.365305 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqxz5\" (UniqueName: \"kubernetes.io/projected/2edbc338-d144-4bd8-a06a-3ea0537ba513-kube-api-access-nqxz5\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.365406 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.365460 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-bundle\") pod 
\"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.366009 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.366126 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.387766 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqxz5\" (UniqueName: \"kubernetes.io/projected/2edbc338-d144-4bd8-a06a-3ea0537ba513-kube-api-access-nqxz5\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.451402 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:06 crc kubenswrapper[4732]: I1010 07:04:06.897747 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt"] Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.055146 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kg7gq_e7a62711-6cb6-4867-a232-8b8b043faa74/console/0.log" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.055221 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.191042 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-trusted-ca-bundle\") pod \"e7a62711-6cb6-4867-a232-8b8b043faa74\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.191474 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-service-ca\") pod \"e7a62711-6cb6-4867-a232-8b8b043faa74\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.191495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-console-config\") pod \"e7a62711-6cb6-4867-a232-8b8b043faa74\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.191530 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj29j\" (UniqueName: 
\"kubernetes.io/projected/e7a62711-6cb6-4867-a232-8b8b043faa74-kube-api-access-bj29j\") pod \"e7a62711-6cb6-4867-a232-8b8b043faa74\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.191614 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-serving-cert\") pod \"e7a62711-6cb6-4867-a232-8b8b043faa74\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.191747 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-oauth-config\") pod \"e7a62711-6cb6-4867-a232-8b8b043faa74\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.191776 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-oauth-serving-cert\") pod \"e7a62711-6cb6-4867-a232-8b8b043faa74\" (UID: \"e7a62711-6cb6-4867-a232-8b8b043faa74\") " Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.193241 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-console-config" (OuterVolumeSpecName: "console-config") pod "e7a62711-6cb6-4867-a232-8b8b043faa74" (UID: "e7a62711-6cb6-4867-a232-8b8b043faa74"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.193307 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-service-ca" (OuterVolumeSpecName: "service-ca") pod "e7a62711-6cb6-4867-a232-8b8b043faa74" (UID: "e7a62711-6cb6-4867-a232-8b8b043faa74"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.193587 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e7a62711-6cb6-4867-a232-8b8b043faa74" (UID: "e7a62711-6cb6-4867-a232-8b8b043faa74"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.193743 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e7a62711-6cb6-4867-a232-8b8b043faa74" (UID: "e7a62711-6cb6-4867-a232-8b8b043faa74"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.199111 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e7a62711-6cb6-4867-a232-8b8b043faa74" (UID: "e7a62711-6cb6-4867-a232-8b8b043faa74"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.199145 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a62711-6cb6-4867-a232-8b8b043faa74-kube-api-access-bj29j" (OuterVolumeSpecName: "kube-api-access-bj29j") pod "e7a62711-6cb6-4867-a232-8b8b043faa74" (UID: "e7a62711-6cb6-4867-a232-8b8b043faa74"). InnerVolumeSpecName "kube-api-access-bj29j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.199277 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e7a62711-6cb6-4867-a232-8b8b043faa74" (UID: "e7a62711-6cb6-4867-a232-8b8b043faa74"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.292884 4732 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.292929 4732 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.292942 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.292954 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-service-ca\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.292966 4732 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7a62711-6cb6-4867-a232-8b8b043faa74-console-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.292977 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj29j\" (UniqueName: \"kubernetes.io/projected/e7a62711-6cb6-4867-a232-8b8b043faa74-kube-api-access-bj29j\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.292994 4732 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7a62711-6cb6-4867-a232-8b8b043faa74-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.424753 4732 generic.go:334] "Generic (PLEG): container finished" podID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerID="513de1e412b9a23958f6ccfc2885691c819987e3031e3896c6ed2677f12ee6c8" exitCode=0 Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.424971 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" event={"ID":"2edbc338-d144-4bd8-a06a-3ea0537ba513","Type":"ContainerDied","Data":"513de1e412b9a23958f6ccfc2885691c819987e3031e3896c6ed2677f12ee6c8"} Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.425060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" event={"ID":"2edbc338-d144-4bd8-a06a-3ea0537ba513","Type":"ContainerStarted","Data":"b9142976eeb2442c3e45034d76b034bb542883dc49716c93a5743f13b6ec0636"} Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.428372 4732 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-kg7gq_e7a62711-6cb6-4867-a232-8b8b043faa74/console/0.log" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.428420 4732 generic.go:334] "Generic (PLEG): container finished" podID="e7a62711-6cb6-4867-a232-8b8b043faa74" containerID="978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8" exitCode=2 Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.428457 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kg7gq" event={"ID":"e7a62711-6cb6-4867-a232-8b8b043faa74","Type":"ContainerDied","Data":"978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8"} Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.428491 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-kg7gq" event={"ID":"e7a62711-6cb6-4867-a232-8b8b043faa74","Type":"ContainerDied","Data":"bf5eda4ec642e56e7abf1be6d56f32c7819c21d02b880999b2cca11b594cdedc"} Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.428511 4732 scope.go:117] "RemoveContainer" containerID="978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.428652 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-kg7gq" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.464038 4732 scope.go:117] "RemoveContainer" containerID="978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8" Oct 10 07:04:07 crc kubenswrapper[4732]: E1010 07:04:07.465518 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8\": container with ID starting with 978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8 not found: ID does not exist" containerID="978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.465563 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8"} err="failed to get container status \"978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8\": rpc error: code = NotFound desc = could not find container \"978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8\": container with ID starting with 978b838310fa66a45287bbae2d60ca61f3653ba1ec9b07c4e37efef8a14a98b8 not found: ID does not exist" Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.468514 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-kg7gq"] Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.472016 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-kg7gq"] Oct 10 07:04:07 crc kubenswrapper[4732]: I1010 07:04:07.684498 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a62711-6cb6-4867-a232-8b8b043faa74" path="/var/lib/kubelet/pods/e7a62711-6cb6-4867-a232-8b8b043faa74/volumes" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.467093 4732 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-operators-qg9gq"] Oct 10 07:04:09 crc kubenswrapper[4732]: E1010 07:04:09.467377 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a62711-6cb6-4867-a232-8b8b043faa74" containerName="console" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.467391 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a62711-6cb6-4867-a232-8b8b043faa74" containerName="console" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.467519 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a62711-6cb6-4867-a232-8b8b043faa74" containerName="console" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.469073 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.480069 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qg9gq"] Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.624260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-utilities\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.624300 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-catalog-content\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.624326 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lwc8\" 
(UniqueName: \"kubernetes.io/projected/79925f1a-3bdb-4d04-aa93-2cd245fc9830-kube-api-access-8lwc8\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.725377 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-utilities\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.725431 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-catalog-content\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.725468 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lwc8\" (UniqueName: \"kubernetes.io/projected/79925f1a-3bdb-4d04-aa93-2cd245fc9830-kube-api-access-8lwc8\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.726194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-utilities\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.726235 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-catalog-content\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.743331 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lwc8\" (UniqueName: \"kubernetes.io/projected/79925f1a-3bdb-4d04-aa93-2cd245fc9830-kube-api-access-8lwc8\") pod \"redhat-operators-qg9gq\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:09 crc kubenswrapper[4732]: I1010 07:04:09.783797 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:10 crc kubenswrapper[4732]: I1010 07:04:10.264058 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qg9gq"] Oct 10 07:04:10 crc kubenswrapper[4732]: W1010 07:04:10.269970 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79925f1a_3bdb_4d04_aa93_2cd245fc9830.slice/crio-49b492c36646373717357cc7efcaefaa404cdfb8dc21a437b9ac3c42d9875f1b WatchSource:0}: Error finding container 49b492c36646373717357cc7efcaefaa404cdfb8dc21a437b9ac3c42d9875f1b: Status 404 returned error can't find the container with id 49b492c36646373717357cc7efcaefaa404cdfb8dc21a437b9ac3c42d9875f1b Oct 10 07:04:10 crc kubenswrapper[4732]: I1010 07:04:10.449224 4732 generic.go:334] "Generic (PLEG): container finished" podID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerID="1f205b02d8930e72793e7a3eb7fec6467728652838cd9487d11e48d5033e717c" exitCode=0 Oct 10 07:04:10 crc kubenswrapper[4732]: I1010 07:04:10.449329 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" 
event={"ID":"2edbc338-d144-4bd8-a06a-3ea0537ba513","Type":"ContainerDied","Data":"1f205b02d8930e72793e7a3eb7fec6467728652838cd9487d11e48d5033e717c"} Oct 10 07:04:10 crc kubenswrapper[4732]: I1010 07:04:10.452393 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg9gq" event={"ID":"79925f1a-3bdb-4d04-aa93-2cd245fc9830","Type":"ContainerStarted","Data":"aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377"} Oct 10 07:04:10 crc kubenswrapper[4732]: I1010 07:04:10.452423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg9gq" event={"ID":"79925f1a-3bdb-4d04-aa93-2cd245fc9830","Type":"ContainerStarted","Data":"49b492c36646373717357cc7efcaefaa404cdfb8dc21a437b9ac3c42d9875f1b"} Oct 10 07:04:11 crc kubenswrapper[4732]: I1010 07:04:11.460640 4732 generic.go:334] "Generic (PLEG): container finished" podID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerID="aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377" exitCode=0 Oct 10 07:04:11 crc kubenswrapper[4732]: I1010 07:04:11.460804 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg9gq" event={"ID":"79925f1a-3bdb-4d04-aa93-2cd245fc9830","Type":"ContainerDied","Data":"aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377"} Oct 10 07:04:11 crc kubenswrapper[4732]: I1010 07:04:11.466334 4732 generic.go:334] "Generic (PLEG): container finished" podID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerID="c3d2d2d10468c951bcf0c2a75d430cd3c64ed6cba5b5f260079499803fcf65b8" exitCode=0 Oct 10 07:04:11 crc kubenswrapper[4732]: I1010 07:04:11.466378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" event={"ID":"2edbc338-d144-4bd8-a06a-3ea0537ba513","Type":"ContainerDied","Data":"c3d2d2d10468c951bcf0c2a75d430cd3c64ed6cba5b5f260079499803fcf65b8"} Oct 10 07:04:12 crc 
kubenswrapper[4732]: I1010 07:04:12.797251 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:12 crc kubenswrapper[4732]: I1010 07:04:12.984069 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-util\") pod \"2edbc338-d144-4bd8-a06a-3ea0537ba513\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " Oct 10 07:04:12 crc kubenswrapper[4732]: I1010 07:04:12.984152 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqxz5\" (UniqueName: \"kubernetes.io/projected/2edbc338-d144-4bd8-a06a-3ea0537ba513-kube-api-access-nqxz5\") pod \"2edbc338-d144-4bd8-a06a-3ea0537ba513\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " Oct 10 07:04:12 crc kubenswrapper[4732]: I1010 07:04:12.984183 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-bundle\") pod \"2edbc338-d144-4bd8-a06a-3ea0537ba513\" (UID: \"2edbc338-d144-4bd8-a06a-3ea0537ba513\") " Oct 10 07:04:12 crc kubenswrapper[4732]: I1010 07:04:12.985296 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-bundle" (OuterVolumeSpecName: "bundle") pod "2edbc338-d144-4bd8-a06a-3ea0537ba513" (UID: "2edbc338-d144-4bd8-a06a-3ea0537ba513"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:04:12 crc kubenswrapper[4732]: I1010 07:04:12.996404 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2edbc338-d144-4bd8-a06a-3ea0537ba513-kube-api-access-nqxz5" (OuterVolumeSpecName: "kube-api-access-nqxz5") pod "2edbc338-d144-4bd8-a06a-3ea0537ba513" (UID: "2edbc338-d144-4bd8-a06a-3ea0537ba513"). InnerVolumeSpecName "kube-api-access-nqxz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.069736 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-util" (OuterVolumeSpecName: "util") pod "2edbc338-d144-4bd8-a06a-3ea0537ba513" (UID: "2edbc338-d144-4bd8-a06a-3ea0537ba513"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.086095 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-util\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.086916 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqxz5\" (UniqueName: \"kubernetes.io/projected/2edbc338-d144-4bd8-a06a-3ea0537ba513-kube-api-access-nqxz5\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.087010 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2edbc338-d144-4bd8-a06a-3ea0537ba513-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.482599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" 
event={"ID":"2edbc338-d144-4bd8-a06a-3ea0537ba513","Type":"ContainerDied","Data":"b9142976eeb2442c3e45034d76b034bb542883dc49716c93a5743f13b6ec0636"} Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.482642 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9142976eeb2442c3e45034d76b034bb542883dc49716c93a5743f13b6ec0636" Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.483131 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt" Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.484462 4732 generic.go:334] "Generic (PLEG): container finished" podID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerID="5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58" exitCode=0 Oct 10 07:04:13 crc kubenswrapper[4732]: I1010 07:04:13.484496 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg9gq" event={"ID":"79925f1a-3bdb-4d04-aa93-2cd245fc9830","Type":"ContainerDied","Data":"5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58"} Oct 10 07:04:14 crc kubenswrapper[4732]: I1010 07:04:14.492599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg9gq" event={"ID":"79925f1a-3bdb-4d04-aa93-2cd245fc9830","Type":"ContainerStarted","Data":"114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5"} Oct 10 07:04:14 crc kubenswrapper[4732]: I1010 07:04:14.515645 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qg9gq" podStartSLOduration=3.008867453 podStartE2EDuration="5.515621992s" podCreationTimestamp="2025-10-10 07:04:09 +0000 UTC" firstStartedPulling="2025-10-10 07:04:11.462228698 +0000 UTC m=+778.531819939" lastFinishedPulling="2025-10-10 07:04:13.968983247 +0000 UTC m=+781.038574478" observedRunningTime="2025-10-10 
07:04:14.510651239 +0000 UTC m=+781.580242540" watchObservedRunningTime="2025-10-10 07:04:14.515621992 +0000 UTC m=+781.585213273" Oct 10 07:04:19 crc kubenswrapper[4732]: I1010 07:04:19.784672 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:19 crc kubenswrapper[4732]: I1010 07:04:19.786640 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:19 crc kubenswrapper[4732]: I1010 07:04:19.823854 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:20 crc kubenswrapper[4732]: I1010 07:04:20.562286 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:23 crc kubenswrapper[4732]: I1010 07:04:23.258884 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qg9gq"] Oct 10 07:04:23 crc kubenswrapper[4732]: I1010 07:04:23.259402 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qg9gq" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="registry-server" containerID="cri-o://114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5" gracePeriod=2 Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.224188 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-686c95bfd-424n7"] Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 07:04:24.224712 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerName="util" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.224732 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerName="util" Oct 10 07:04:24 
crc kubenswrapper[4732]: E1010 07:04:24.224745 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerName="extract" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.224754 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerName="extract" Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 07:04:24.224778 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerName="pull" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.224785 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerName="pull" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.224885 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2edbc338-d144-4bd8-a06a-3ea0537ba513" containerName="extract" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.225236 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.227736 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.227962 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.228106 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-njkkq" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.228330 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.229734 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.250965 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-686c95bfd-424n7"] Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.317808 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.354167 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-apiservice-cert\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.354271 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-webhook-cert\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.354308 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmcj\" (UniqueName: \"kubernetes.io/projected/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-kube-api-access-brmcj\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.454197 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts"] Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 07:04:24.454480 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="registry-server" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.454499 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="registry-server" Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 07:04:24.454522 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="extract-utilities" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.454528 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="extract-utilities" Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 07:04:24.454538 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="extract-content" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.454547 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="extract-content" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.454656 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerName="registry-server" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.455117 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-catalog-content\") pod \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.455180 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-utilities\") pod \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.455191 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.455258 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lwc8\" (UniqueName: \"kubernetes.io/projected/79925f1a-3bdb-4d04-aa93-2cd245fc9830-kube-api-access-8lwc8\") pod \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\" (UID: \"79925f1a-3bdb-4d04-aa93-2cd245fc9830\") " Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.455554 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-webhook-cert\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.455601 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmcj\" (UniqueName: \"kubernetes.io/projected/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-kube-api-access-brmcj\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.455714 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-apiservice-cert\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.456085 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-utilities" (OuterVolumeSpecName: "utilities") pod "79925f1a-3bdb-4d04-aa93-2cd245fc9830" (UID: "79925f1a-3bdb-4d04-aa93-2cd245fc9830"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.458818 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-f4fsn" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.459134 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.459322 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.463919 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79925f1a-3bdb-4d04-aa93-2cd245fc9830-kube-api-access-8lwc8" (OuterVolumeSpecName: "kube-api-access-8lwc8") pod "79925f1a-3bdb-4d04-aa93-2cd245fc9830" (UID: "79925f1a-3bdb-4d04-aa93-2cd245fc9830"). InnerVolumeSpecName "kube-api-access-8lwc8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.467431 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-apiservice-cert\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.482234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-webhook-cert\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.506647 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts"] Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.515493 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmcj\" (UniqueName: \"kubernetes.io/projected/d510ee8f-515b-4088-8bc2-afb87f7ccf6e-kube-api-access-brmcj\") pod \"metallb-operator-controller-manager-686c95bfd-424n7\" (UID: \"d510ee8f-515b-4088-8bc2-afb87f7ccf6e\") " pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.550818 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79925f1a-3bdb-4d04-aa93-2cd245fc9830" (UID: "79925f1a-3bdb-4d04-aa93-2cd245fc9830"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.554204 4732 generic.go:334] "Generic (PLEG): container finished" podID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" containerID="114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5" exitCode=0 Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.554258 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg9gq" event={"ID":"79925f1a-3bdb-4d04-aa93-2cd245fc9830","Type":"ContainerDied","Data":"114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5"} Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.554290 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qg9gq" event={"ID":"79925f1a-3bdb-4d04-aa93-2cd245fc9830","Type":"ContainerDied","Data":"49b492c36646373717357cc7efcaefaa404cdfb8dc21a437b9ac3c42d9875f1b"} Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.554311 4732 scope.go:117] "RemoveContainer" containerID="114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.554474 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qg9gq" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.560040 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f75ce88-78e8-4e65-a7d5-eeb19c049313-webhook-cert\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.560085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fht54\" (UniqueName: \"kubernetes.io/projected/7f75ce88-78e8-4e65-a7d5-eeb19c049313-kube-api-access-fht54\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.560114 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f75ce88-78e8-4e65-a7d5-eeb19c049313-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.560159 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.560171 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79925f1a-3bdb-4d04-aa93-2cd245fc9830-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:24 crc 
kubenswrapper[4732]: I1010 07:04:24.560182 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lwc8\" (UniqueName: \"kubernetes.io/projected/79925f1a-3bdb-4d04-aa93-2cd245fc9830-kube-api-access-8lwc8\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.584915 4732 scope.go:117] "RemoveContainer" containerID="5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.602841 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qg9gq"] Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.617139 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.635940 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qg9gq"] Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.662348 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f75ce88-78e8-4e65-a7d5-eeb19c049313-webhook-cert\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.662409 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fht54\" (UniqueName: \"kubernetes.io/projected/7f75ce88-78e8-4e65-a7d5-eeb19c049313-kube-api-access-fht54\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.662455 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f75ce88-78e8-4e65-a7d5-eeb19c049313-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.666230 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f75ce88-78e8-4e65-a7d5-eeb19c049313-webhook-cert\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.666380 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f75ce88-78e8-4e65-a7d5-eeb19c049313-apiservice-cert\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.670639 4732 scope.go:117] "RemoveContainer" containerID="aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.695498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fht54\" (UniqueName: \"kubernetes.io/projected/7f75ce88-78e8-4e65-a7d5-eeb19c049313-kube-api-access-fht54\") pod \"metallb-operator-webhook-server-6cf5cdccc4-5mpts\" (UID: \"7f75ce88-78e8-4e65-a7d5-eeb19c049313\") " pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.704150 4732 scope.go:117] "RemoveContainer" containerID="114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5" Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 
07:04:24.706134 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5\": container with ID starting with 114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5 not found: ID does not exist" containerID="114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.706328 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5"} err="failed to get container status \"114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5\": rpc error: code = NotFound desc = could not find container \"114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5\": container with ID starting with 114583b9929eb3d66797435b5a6d59d807ae9153b9d48acfcca7f9fa0ec1f0c5 not found: ID does not exist" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.706468 4732 scope.go:117] "RemoveContainer" containerID="5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58" Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 07:04:24.707576 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58\": container with ID starting with 5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58 not found: ID does not exist" containerID="5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.707712 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58"} err="failed to get container status \"5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58\": rpc 
error: code = NotFound desc = could not find container \"5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58\": container with ID starting with 5c0bbb08e743a704e67e27fe560c4230b2077d6052d6eecb8d1b7e9dfb2aff58 not found: ID does not exist" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.707841 4732 scope.go:117] "RemoveContainer" containerID="aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377" Oct 10 07:04:24 crc kubenswrapper[4732]: E1010 07:04:24.709823 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377\": container with ID starting with aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377 not found: ID does not exist" containerID="aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.709869 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377"} err="failed to get container status \"aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377\": rpc error: code = NotFound desc = could not find container \"aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377\": container with ID starting with aad85d00a79a5b9e6c366d6a0a9813bc6acd419f19d9c5367042bc655ab78377 not found: ID does not exist" Oct 10 07:04:24 crc kubenswrapper[4732]: I1010 07:04:24.845473 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:25 crc kubenswrapper[4732]: I1010 07:04:25.150174 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-686c95bfd-424n7"] Oct 10 07:04:25 crc kubenswrapper[4732]: W1010 07:04:25.163648 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd510ee8f_515b_4088_8bc2_afb87f7ccf6e.slice/crio-8f1bd533f6c0d434221e7ef2b6862b0ea8968bc5f0dc6b9dadce17f971fb50f5 WatchSource:0}: Error finding container 8f1bd533f6c0d434221e7ef2b6862b0ea8968bc5f0dc6b9dadce17f971fb50f5: Status 404 returned error can't find the container with id 8f1bd533f6c0d434221e7ef2b6862b0ea8968bc5f0dc6b9dadce17f971fb50f5 Oct 10 07:04:25 crc kubenswrapper[4732]: I1010 07:04:25.316165 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts"] Oct 10 07:04:25 crc kubenswrapper[4732]: W1010 07:04:25.321085 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f75ce88_78e8_4e65_a7d5_eeb19c049313.slice/crio-5bae8e472ac3de590dd5b54dc0247b628485ddf1cb3c8df8785ac910515a022e WatchSource:0}: Error finding container 5bae8e472ac3de590dd5b54dc0247b628485ddf1cb3c8df8785ac910515a022e: Status 404 returned error can't find the container with id 5bae8e472ac3de590dd5b54dc0247b628485ddf1cb3c8df8785ac910515a022e Oct 10 07:04:25 crc kubenswrapper[4732]: I1010 07:04:25.561432 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" event={"ID":"7f75ce88-78e8-4e65-a7d5-eeb19c049313","Type":"ContainerStarted","Data":"5bae8e472ac3de590dd5b54dc0247b628485ddf1cb3c8df8785ac910515a022e"} Oct 10 07:04:25 crc kubenswrapper[4732]: I1010 07:04:25.563681 4732 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" event={"ID":"d510ee8f-515b-4088-8bc2-afb87f7ccf6e","Type":"ContainerStarted","Data":"8f1bd533f6c0d434221e7ef2b6862b0ea8968bc5f0dc6b9dadce17f971fb50f5"} Oct 10 07:04:25 crc kubenswrapper[4732]: I1010 07:04:25.666979 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79925f1a-3bdb-4d04-aa93-2cd245fc9830" path="/var/lib/kubelet/pods/79925f1a-3bdb-4d04-aa93-2cd245fc9830/volumes" Oct 10 07:04:32 crc kubenswrapper[4732]: I1010 07:04:32.607560 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" event={"ID":"7f75ce88-78e8-4e65-a7d5-eeb19c049313","Type":"ContainerStarted","Data":"fbac7fed99340c694a17c9fa730b612cde96162067701c7837262fe69086e5cb"} Oct 10 07:04:32 crc kubenswrapper[4732]: I1010 07:04:32.608221 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:32 crc kubenswrapper[4732]: I1010 07:04:32.609975 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" event={"ID":"d510ee8f-515b-4088-8bc2-afb87f7ccf6e","Type":"ContainerStarted","Data":"11b290af7b15870af3abc839d0581f1b593f85d7a56a569c26c79f488ca4c3a5"} Oct 10 07:04:32 crc kubenswrapper[4732]: I1010 07:04:32.610154 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:04:32 crc kubenswrapper[4732]: I1010 07:04:32.632365 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" podStartSLOduration=2.5490320349999998 podStartE2EDuration="8.632347658s" podCreationTimestamp="2025-10-10 07:04:24 +0000 UTC" firstStartedPulling="2025-10-10 07:04:25.324894798 +0000 UTC m=+792.394486039" 
lastFinishedPulling="2025-10-10 07:04:31.408210421 +0000 UTC m=+798.477801662" observedRunningTime="2025-10-10 07:04:32.628454104 +0000 UTC m=+799.698045345" watchObservedRunningTime="2025-10-10 07:04:32.632347658 +0000 UTC m=+799.701938899" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.370940 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" podStartSLOduration=14.151280199 podStartE2EDuration="20.370909864s" podCreationTimestamp="2025-10-10 07:04:24 +0000 UTC" firstStartedPulling="2025-10-10 07:04:25.169345542 +0000 UTC m=+792.238936783" lastFinishedPulling="2025-10-10 07:04:31.388975207 +0000 UTC m=+798.458566448" observedRunningTime="2025-10-10 07:04:32.659857883 +0000 UTC m=+799.729449144" watchObservedRunningTime="2025-10-10 07:04:44.370909864 +0000 UTC m=+811.440501105" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.374393 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6bpjf"] Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.376435 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.384916 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bpjf"] Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.444073 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-catalog-content\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.444373 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-utilities\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.444512 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qkz9\" (UniqueName: \"kubernetes.io/projected/83de961c-afaf-4b3d-8217-29c16843fe5c-kube-api-access-8qkz9\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.545452 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-catalog-content\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.545791 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-utilities\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.545930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qkz9\" (UniqueName: \"kubernetes.io/projected/83de961c-afaf-4b3d-8217-29c16843fe5c-kube-api-access-8qkz9\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.546176 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-catalog-content\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.546465 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-utilities\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.568069 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qkz9\" (UniqueName: \"kubernetes.io/projected/83de961c-afaf-4b3d-8217-29c16843fe5c-kube-api-access-8qkz9\") pod \"community-operators-6bpjf\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.704679 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:44 crc kubenswrapper[4732]: I1010 07:04:44.867035 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6cf5cdccc4-5mpts" Oct 10 07:04:45 crc kubenswrapper[4732]: I1010 07:04:45.267962 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bpjf"] Oct 10 07:04:45 crc kubenswrapper[4732]: W1010 07:04:45.280872 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83de961c_afaf_4b3d_8217_29c16843fe5c.slice/crio-f88091b410a6826211d5616688e1664c0e3506f182894d44c8c6da45ee43e90a WatchSource:0}: Error finding container f88091b410a6826211d5616688e1664c0e3506f182894d44c8c6da45ee43e90a: Status 404 returned error can't find the container with id f88091b410a6826211d5616688e1664c0e3506f182894d44c8c6da45ee43e90a Oct 10 07:04:45 crc kubenswrapper[4732]: I1010 07:04:45.699675 4732 generic.go:334] "Generic (PLEG): container finished" podID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerID="ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31" exitCode=0 Oct 10 07:04:45 crc kubenswrapper[4732]: I1010 07:04:45.699869 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpjf" event={"ID":"83de961c-afaf-4b3d-8217-29c16843fe5c","Type":"ContainerDied","Data":"ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31"} Oct 10 07:04:45 crc kubenswrapper[4732]: I1010 07:04:45.701016 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpjf" event={"ID":"83de961c-afaf-4b3d-8217-29c16843fe5c","Type":"ContainerStarted","Data":"f88091b410a6826211d5616688e1664c0e3506f182894d44c8c6da45ee43e90a"} Oct 10 07:04:47 crc kubenswrapper[4732]: I1010 07:04:47.712309 4732 generic.go:334] "Generic (PLEG): 
container finished" podID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerID="d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8" exitCode=0 Oct 10 07:04:47 crc kubenswrapper[4732]: I1010 07:04:47.712359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpjf" event={"ID":"83de961c-afaf-4b3d-8217-29c16843fe5c","Type":"ContainerDied","Data":"d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8"} Oct 10 07:04:48 crc kubenswrapper[4732]: I1010 07:04:48.720527 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpjf" event={"ID":"83de961c-afaf-4b3d-8217-29c16843fe5c","Type":"ContainerStarted","Data":"b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3"} Oct 10 07:04:48 crc kubenswrapper[4732]: I1010 07:04:48.746871 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6bpjf" podStartSLOduration=1.9000734129999999 podStartE2EDuration="4.746851366s" podCreationTimestamp="2025-10-10 07:04:44 +0000 UTC" firstStartedPulling="2025-10-10 07:04:45.702056682 +0000 UTC m=+812.771647923" lastFinishedPulling="2025-10-10 07:04:48.548834635 +0000 UTC m=+815.618425876" observedRunningTime="2025-10-10 07:04:48.746386764 +0000 UTC m=+815.815978025" watchObservedRunningTime="2025-10-10 07:04:48.746851366 +0000 UTC m=+815.816442607" Oct 10 07:04:54 crc kubenswrapper[4732]: I1010 07:04:54.706150 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:54 crc kubenswrapper[4732]: I1010 07:04:54.706875 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:54 crc kubenswrapper[4732]: I1010 07:04:54.747978 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:54 crc kubenswrapper[4732]: I1010 07:04:54.808905 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:54 crc kubenswrapper[4732]: I1010 07:04:54.985823 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bpjf"] Oct 10 07:04:56 crc kubenswrapper[4732]: I1010 07:04:56.766731 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6bpjf" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="registry-server" containerID="cri-o://b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3" gracePeriod=2 Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.145067 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.319955 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qkz9\" (UniqueName: \"kubernetes.io/projected/83de961c-afaf-4b3d-8217-29c16843fe5c-kube-api-access-8qkz9\") pod \"83de961c-afaf-4b3d-8217-29c16843fe5c\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.320057 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-utilities\") pod \"83de961c-afaf-4b3d-8217-29c16843fe5c\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.320123 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-catalog-content\") pod 
\"83de961c-afaf-4b3d-8217-29c16843fe5c\" (UID: \"83de961c-afaf-4b3d-8217-29c16843fe5c\") " Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.320888 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-utilities" (OuterVolumeSpecName: "utilities") pod "83de961c-afaf-4b3d-8217-29c16843fe5c" (UID: "83de961c-afaf-4b3d-8217-29c16843fe5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.326029 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83de961c-afaf-4b3d-8217-29c16843fe5c-kube-api-access-8qkz9" (OuterVolumeSpecName: "kube-api-access-8qkz9") pod "83de961c-afaf-4b3d-8217-29c16843fe5c" (UID: "83de961c-afaf-4b3d-8217-29c16843fe5c"). InnerVolumeSpecName "kube-api-access-8qkz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.421473 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.421517 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qkz9\" (UniqueName: \"kubernetes.io/projected/83de961c-afaf-4b3d-8217-29c16843fe5c-kube-api-access-8qkz9\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.772576 4732 generic.go:334] "Generic (PLEG): container finished" podID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerID="b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3" exitCode=0 Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.772651 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bpjf" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.772672 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpjf" event={"ID":"83de961c-afaf-4b3d-8217-29c16843fe5c","Type":"ContainerDied","Data":"b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3"} Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.773104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bpjf" event={"ID":"83de961c-afaf-4b3d-8217-29c16843fe5c","Type":"ContainerDied","Data":"f88091b410a6826211d5616688e1664c0e3506f182894d44c8c6da45ee43e90a"} Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.773124 4732 scope.go:117] "RemoveContainer" containerID="b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.800183 4732 scope.go:117] "RemoveContainer" containerID="d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.801435 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8t64z"] Oct 10 07:04:57 crc kubenswrapper[4732]: E1010 07:04:57.803126 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="extract-utilities" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.803147 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="extract-utilities" Oct 10 07:04:57 crc kubenswrapper[4732]: E1010 07:04:57.803157 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="extract-content" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.803163 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="extract-content" Oct 10 07:04:57 crc kubenswrapper[4732]: E1010 07:04:57.803185 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="registry-server" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.803193 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="registry-server" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.803331 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" containerName="registry-server" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.804560 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.808797 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t64z"] Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.840634 4732 scope.go:117] "RemoveContainer" containerID="ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.862097 4732 scope.go:117] "RemoveContainer" containerID="b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3" Oct 10 07:04:57 crc kubenswrapper[4732]: E1010 07:04:57.862780 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3\": container with ID starting with b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3 not found: ID does not exist" containerID="b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.862825 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3"} err="failed to get container status \"b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3\": rpc error: code = NotFound desc = could not find container \"b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3\": container with ID starting with b9f6346cad68a04dc080302dac4ccd6c258b3c205954e03213c1b4726f6dcae3 not found: ID does not exist" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.862853 4732 scope.go:117] "RemoveContainer" containerID="d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8" Oct 10 07:04:57 crc kubenswrapper[4732]: E1010 07:04:57.863359 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8\": container with ID starting with d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8 not found: ID does not exist" containerID="d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.863392 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8"} err="failed to get container status \"d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8\": rpc error: code = NotFound desc = could not find container \"d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8\": container with ID starting with d20686276e42facc12dbed03191912281c2d8a570f4ef358fd0a51e1ef5915e8 not found: ID does not exist" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.863414 4732 scope.go:117] "RemoveContainer" containerID="ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31" Oct 10 07:04:57 crc kubenswrapper[4732]: E1010 07:04:57.863736 4732 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31\": container with ID starting with ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31 not found: ID does not exist" containerID="ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.863763 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31"} err="failed to get container status \"ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31\": rpc error: code = NotFound desc = could not find container \"ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31\": container with ID starting with ea56efe78db2db38e19aab725d6bfa96d43c2addd81d84ddaf95ff11d092ef31 not found: ID does not exist" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.926791 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-utilities\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.926874 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-catalog-content\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:57 crc kubenswrapper[4732]: I1010 07:04:57.926999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-296q6\" (UniqueName: 
\"kubernetes.io/projected/22b90825-cb7e-4a96-9f89-77ebf442e1da-kube-api-access-296q6\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.003097 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83de961c-afaf-4b3d-8217-29c16843fe5c" (UID: "83de961c-afaf-4b3d-8217-29c16843fe5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.028018 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-utilities\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.028114 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-catalog-content\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.028145 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-296q6\" (UniqueName: \"kubernetes.io/projected/22b90825-cb7e-4a96-9f89-77ebf442e1da-kube-api-access-296q6\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.028724 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/83de961c-afaf-4b3d-8217-29c16843fe5c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.029213 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-utilities\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.029295 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-catalog-content\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.049822 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-296q6\" (UniqueName: \"kubernetes.io/projected/22b90825-cb7e-4a96-9f89-77ebf442e1da-kube-api-access-296q6\") pod \"redhat-marketplace-8t64z\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.103956 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bpjf"] Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.109451 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6bpjf"] Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.143516 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.578610 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t64z"] Oct 10 07:04:58 crc kubenswrapper[4732]: W1010 07:04:58.578830 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b90825_cb7e_4a96_9f89_77ebf442e1da.slice/crio-a6d75ecbcb87ace84b03ca36756517710b696839580a9c531b7b79b87715715b WatchSource:0}: Error finding container a6d75ecbcb87ace84b03ca36756517710b696839580a9c531b7b79b87715715b: Status 404 returned error can't find the container with id a6d75ecbcb87ace84b03ca36756517710b696839580a9c531b7b79b87715715b Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.787288 4732 generic.go:334] "Generic (PLEG): container finished" podID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerID="3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb" exitCode=0 Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.787425 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t64z" event={"ID":"22b90825-cb7e-4a96-9f89-77ebf442e1da","Type":"ContainerDied","Data":"3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb"} Oct 10 07:04:58 crc kubenswrapper[4732]: I1010 07:04:58.787745 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t64z" event={"ID":"22b90825-cb7e-4a96-9f89-77ebf442e1da","Type":"ContainerStarted","Data":"a6d75ecbcb87ace84b03ca36756517710b696839580a9c531b7b79b87715715b"} Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.197290 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qq5dz"] Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.199611 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.206245 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qq5dz"] Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.243651 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bvcj\" (UniqueName: \"kubernetes.io/projected/4197f98f-5abe-4dc3-a776-5a54faab28b5-kube-api-access-4bvcj\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.243813 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-catalog-content\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.243859 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-utilities\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.344708 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bvcj\" (UniqueName: \"kubernetes.io/projected/4197f98f-5abe-4dc3-a776-5a54faab28b5-kube-api-access-4bvcj\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.344816 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-catalog-content\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.344840 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-utilities\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.345332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-utilities\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.345438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-catalog-content\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.366491 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bvcj\" (UniqueName: \"kubernetes.io/projected/4197f98f-5abe-4dc3-a776-5a54faab28b5-kube-api-access-4bvcj\") pod \"certified-operators-qq5dz\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.517661 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.675073 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83de961c-afaf-4b3d-8217-29c16843fe5c" path="/var/lib/kubelet/pods/83de961c-afaf-4b3d-8217-29c16843fe5c/volumes" Oct 10 07:04:59 crc kubenswrapper[4732]: I1010 07:04:59.982569 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qq5dz"] Oct 10 07:05:00 crc kubenswrapper[4732]: I1010 07:05:00.808977 4732 generic.go:334] "Generic (PLEG): container finished" podID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerID="3198ac2c144776282c556210a27a037fd05784f7b22bcc484f50b07d4180ff77" exitCode=0 Oct 10 07:05:00 crc kubenswrapper[4732]: I1010 07:05:00.809028 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qq5dz" event={"ID":"4197f98f-5abe-4dc3-a776-5a54faab28b5","Type":"ContainerDied","Data":"3198ac2c144776282c556210a27a037fd05784f7b22bcc484f50b07d4180ff77"} Oct 10 07:05:00 crc kubenswrapper[4732]: I1010 07:05:00.809591 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qq5dz" event={"ID":"4197f98f-5abe-4dc3-a776-5a54faab28b5","Type":"ContainerStarted","Data":"2bec00598d6586bacfb0c722d0e5c6d4f560c5cbac2589173379c46f8bb44083"} Oct 10 07:05:00 crc kubenswrapper[4732]: I1010 07:05:00.811539 4732 generic.go:334] "Generic (PLEG): container finished" podID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerID="ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4" exitCode=0 Oct 10 07:05:00 crc kubenswrapper[4732]: I1010 07:05:00.811564 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t64z" event={"ID":"22b90825-cb7e-4a96-9f89-77ebf442e1da","Type":"ContainerDied","Data":"ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4"} Oct 10 07:05:01 crc 
kubenswrapper[4732]: I1010 07:05:01.817498 4732 generic.go:334] "Generic (PLEG): container finished" podID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerID="b1ed1a31ef24d16d48beccae55e1e3e7634d1b8c77ec49455e5c86c97eabcd90" exitCode=0 Oct 10 07:05:01 crc kubenswrapper[4732]: I1010 07:05:01.817889 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qq5dz" event={"ID":"4197f98f-5abe-4dc3-a776-5a54faab28b5","Type":"ContainerDied","Data":"b1ed1a31ef24d16d48beccae55e1e3e7634d1b8c77ec49455e5c86c97eabcd90"} Oct 10 07:05:01 crc kubenswrapper[4732]: I1010 07:05:01.823219 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t64z" event={"ID":"22b90825-cb7e-4a96-9f89-77ebf442e1da","Type":"ContainerStarted","Data":"30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f"} Oct 10 07:05:01 crc kubenswrapper[4732]: I1010 07:05:01.852069 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8t64z" podStartSLOduration=2.383084377 podStartE2EDuration="4.852051926s" podCreationTimestamp="2025-10-10 07:04:57 +0000 UTC" firstStartedPulling="2025-10-10 07:04:58.788594323 +0000 UTC m=+825.858185564" lastFinishedPulling="2025-10-10 07:05:01.257561872 +0000 UTC m=+828.327153113" observedRunningTime="2025-10-10 07:05:01.851286236 +0000 UTC m=+828.920877487" watchObservedRunningTime="2025-10-10 07:05:01.852051926 +0000 UTC m=+828.921643167" Oct 10 07:05:02 crc kubenswrapper[4732]: I1010 07:05:02.830994 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qq5dz" event={"ID":"4197f98f-5abe-4dc3-a776-5a54faab28b5","Type":"ContainerStarted","Data":"3504e0349f09b1ce16937813f2c9101f90f6712654ee4614da91e8fc1e6f5aa2"} Oct 10 07:05:02 crc kubenswrapper[4732]: I1010 07:05:02.850736 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-qq5dz" podStartSLOduration=2.416689655 podStartE2EDuration="3.85071432s" podCreationTimestamp="2025-10-10 07:04:59 +0000 UTC" firstStartedPulling="2025-10-10 07:05:00.810649482 +0000 UTC m=+827.880240743" lastFinishedPulling="2025-10-10 07:05:02.244674167 +0000 UTC m=+829.314265408" observedRunningTime="2025-10-10 07:05:02.846863947 +0000 UTC m=+829.916455198" watchObservedRunningTime="2025-10-10 07:05:02.85071432 +0000 UTC m=+829.920305581" Oct 10 07:05:04 crc kubenswrapper[4732]: I1010 07:05:04.621163 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-686c95bfd-424n7" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.289880 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg"] Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.290721 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.298755 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-62wtx"] Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.301117 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.301369 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-mbdzh" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.301537 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.306040 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.306096 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.315529 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg"] Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425488 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-sockets\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425563 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9748f00d-5c14-4cde-aea7-6d364ca08325-metrics-certs\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425623 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-reloader\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425720 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-metrics\") pod \"frr-k8s-62wtx\" (UID: 
\"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425761 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-startup\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425787 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6g67\" (UniqueName: \"kubernetes.io/projected/9748f00d-5c14-4cde-aea7-6d364ca08325-kube-api-access-c6g67\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425892 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cda66c6-9c63-41d3-b614-16bf38d53346-cert\") pod \"frr-k8s-webhook-server-64bf5d555-tnccg\" (UID: \"5cda66c6-9c63-41d3-b614-16bf38d53346\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.425974 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-conf\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.426006 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4n2m\" (UniqueName: \"kubernetes.io/projected/5cda66c6-9c63-41d3-b614-16bf38d53346-kube-api-access-h4n2m\") pod \"frr-k8s-webhook-server-64bf5d555-tnccg\" (UID: 
\"5cda66c6-9c63-41d3-b614-16bf38d53346\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.439900 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gchds"] Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.440965 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.448281 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5g4ng" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.448387 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.449748 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.449818 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.459914 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-xnbhc"] Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.461201 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.463732 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.484626 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xnbhc"] Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527033 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-sockets\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527086 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9748f00d-5c14-4cde-aea7-6d364ca08325-metrics-certs\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527116 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-reloader\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-metrics\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527160 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" 
(UniqueName: \"kubernetes.io/configmap/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-startup\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527180 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6g67\" (UniqueName: \"kubernetes.io/projected/9748f00d-5c14-4cde-aea7-6d364ca08325-kube-api-access-c6g67\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527206 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cda66c6-9c63-41d3-b614-16bf38d53346-cert\") pod \"frr-k8s-webhook-server-64bf5d555-tnccg\" (UID: \"5cda66c6-9c63-41d3-b614-16bf38d53346\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527226 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-conf\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527241 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4n2m\" (UniqueName: \"kubernetes.io/projected/5cda66c6-9c63-41d3-b614-16bf38d53346-kube-api-access-h4n2m\") pod \"frr-k8s-webhook-server-64bf5d555-tnccg\" (UID: \"5cda66c6-9c63-41d3-b614-16bf38d53346\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.527893 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-sockets\") pod 
\"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.529121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-reloader\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.529294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-conf\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.529349 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9748f00d-5c14-4cde-aea7-6d364ca08325-metrics\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.529836 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9748f00d-5c14-4cde-aea7-6d364ca08325-frr-startup\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.534362 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5cda66c6-9c63-41d3-b614-16bf38d53346-cert\") pod \"frr-k8s-webhook-server-64bf5d555-tnccg\" (UID: \"5cda66c6-9c63-41d3-b614-16bf38d53346\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.544164 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9748f00d-5c14-4cde-aea7-6d364ca08325-metrics-certs\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.546242 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4n2m\" (UniqueName: \"kubernetes.io/projected/5cda66c6-9c63-41d3-b614-16bf38d53346-kube-api-access-h4n2m\") pod \"frr-k8s-webhook-server-64bf5d555-tnccg\" (UID: \"5cda66c6-9c63-41d3-b614-16bf38d53346\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.546970 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6g67\" (UniqueName: \"kubernetes.io/projected/9748f00d-5c14-4cde-aea7-6d364ca08325-kube-api-access-c6g67\") pod \"frr-k8s-62wtx\" (UID: \"9748f00d-5c14-4cde-aea7-6d364ca08325\") " pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.609393 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.617440 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.628672 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5r6\" (UniqueName: \"kubernetes.io/projected/fef199bb-8521-40db-9f75-221274c9299d-kube-api-access-pk5r6\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.628750 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/118f95da-3c5d-403c-90ff-9a91056fa449-metrics-certs\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.628830 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/118f95da-3c5d-403c-90ff-9a91056fa449-cert\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.628908 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-metrics-certs\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.628961 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " 
pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.629041 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2x5\" (UniqueName: \"kubernetes.io/projected/118f95da-3c5d-403c-90ff-9a91056fa449-kube-api-access-8b2x5\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.629170 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fef199bb-8521-40db-9f75-221274c9299d-metallb-excludel2\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.730620 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2x5\" (UniqueName: \"kubernetes.io/projected/118f95da-3c5d-403c-90ff-9a91056fa449-kube-api-access-8b2x5\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.730673 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fef199bb-8521-40db-9f75-221274c9299d-metallb-excludel2\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.730729 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5r6\" (UniqueName: \"kubernetes.io/projected/fef199bb-8521-40db-9f75-221274c9299d-kube-api-access-pk5r6\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " 
pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.730755 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/118f95da-3c5d-403c-90ff-9a91056fa449-metrics-certs\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.730816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/118f95da-3c5d-403c-90ff-9a91056fa449-cert\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.730849 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-metrics-certs\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.730880 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: E1010 07:05:05.731030 4732 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 10 07:05:05 crc kubenswrapper[4732]: E1010 07:05:05.731092 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist podName:fef199bb-8521-40db-9f75-221274c9299d nodeName:}" failed. 
No retries permitted until 2025-10-10 07:05:06.231070181 +0000 UTC m=+833.300661422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist") pod "speaker-gchds" (UID: "fef199bb-8521-40db-9f75-221274c9299d") : secret "metallb-memberlist" not found Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.735832 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/fef199bb-8521-40db-9f75-221274c9299d-metallb-excludel2\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.740259 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/118f95da-3c5d-403c-90ff-9a91056fa449-cert\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.740583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-metrics-certs\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.740428 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/118f95da-3c5d-403c-90ff-9a91056fa449-metrics-certs\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.748360 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2x5\" (UniqueName: 
\"kubernetes.io/projected/118f95da-3c5d-403c-90ff-9a91056fa449-kube-api-access-8b2x5\") pod \"controller-68d546b9d8-xnbhc\" (UID: \"118f95da-3c5d-403c-90ff-9a91056fa449\") " pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.749285 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5r6\" (UniqueName: \"kubernetes.io/projected/fef199bb-8521-40db-9f75-221274c9299d-kube-api-access-pk5r6\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.774141 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:05 crc kubenswrapper[4732]: I1010 07:05:05.853488 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerStarted","Data":"73714c7a151323b0fa171d6848df624452e057c2da3dbfd26ce9ed35be3865bc"} Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.042047 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg"] Oct 10 07:05:06 crc kubenswrapper[4732]: W1010 07:05:06.046415 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cda66c6_9c63_41d3_b614_16bf38d53346.slice/crio-eccd5d541449d2f52669f437de2db559db530f3fc8f23a9b3b4cd30ebd3da1f3 WatchSource:0}: Error finding container eccd5d541449d2f52669f437de2db559db530f3fc8f23a9b3b4cd30ebd3da1f3: Status 404 returned error can't find the container with id eccd5d541449d2f52669f437de2db559db530f3fc8f23a9b3b4cd30ebd3da1f3 Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.150047 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xnbhc"] Oct 10 07:05:06 crc 
kubenswrapper[4732]: W1010 07:05:06.155542 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod118f95da_3c5d_403c_90ff_9a91056fa449.slice/crio-a9ab7d348ad1480514a0a65e1c99179d87584ce8a1d39994097f51546b8d7bb2 WatchSource:0}: Error finding container a9ab7d348ad1480514a0a65e1c99179d87584ce8a1d39994097f51546b8d7bb2: Status 404 returned error can't find the container with id a9ab7d348ad1480514a0a65e1c99179d87584ce8a1d39994097f51546b8d7bb2 Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.239628 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:06 crc kubenswrapper[4732]: E1010 07:05:06.239794 4732 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 10 07:05:06 crc kubenswrapper[4732]: E1010 07:05:06.239842 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist podName:fef199bb-8521-40db-9f75-221274c9299d nodeName:}" failed. No retries permitted until 2025-10-10 07:05:07.239828275 +0000 UTC m=+834.309419516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist") pod "speaker-gchds" (UID: "fef199bb-8521-40db-9f75-221274c9299d") : secret "metallb-memberlist" not found Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.860854 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xnbhc" event={"ID":"118f95da-3c5d-403c-90ff-9a91056fa449","Type":"ContainerStarted","Data":"9238721b112baa14190c0692ec4cfa3e13bd2038ad8ce27f61672f96b2bc7ae1"} Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.861134 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.861145 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xnbhc" event={"ID":"118f95da-3c5d-403c-90ff-9a91056fa449","Type":"ContainerStarted","Data":"4db756c261419cb8807f07dd6efc35b90dc95e3357acc79abf74ea6a5500be6c"} Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.861155 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xnbhc" event={"ID":"118f95da-3c5d-403c-90ff-9a91056fa449","Type":"ContainerStarted","Data":"a9ab7d348ad1480514a0a65e1c99179d87584ce8a1d39994097f51546b8d7bb2"} Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.862277 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" event={"ID":"5cda66c6-9c63-41d3-b614-16bf38d53346","Type":"ContainerStarted","Data":"eccd5d541449d2f52669f437de2db559db530f3fc8f23a9b3b4cd30ebd3da1f3"} Oct 10 07:05:06 crc kubenswrapper[4732]: I1010 07:05:06.884802 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-xnbhc" podStartSLOduration=1.884785678 podStartE2EDuration="1.884785678s" podCreationTimestamp="2025-10-10 
07:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:05:06.881944232 +0000 UTC m=+833.951535493" watchObservedRunningTime="2025-10-10 07:05:06.884785678 +0000 UTC m=+833.954376919" Oct 10 07:05:07 crc kubenswrapper[4732]: I1010 07:05:07.255088 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:07 crc kubenswrapper[4732]: I1010 07:05:07.262855 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/fef199bb-8521-40db-9f75-221274c9299d-memberlist\") pod \"speaker-gchds\" (UID: \"fef199bb-8521-40db-9f75-221274c9299d\") " pod="metallb-system/speaker-gchds" Oct 10 07:05:07 crc kubenswrapper[4732]: I1010 07:05:07.554955 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-gchds" Oct 10 07:05:07 crc kubenswrapper[4732]: I1010 07:05:07.874773 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gchds" event={"ID":"fef199bb-8521-40db-9f75-221274c9299d","Type":"ContainerStarted","Data":"3fb0ee6f61ac5c22403d801bbdddf8390d30e4372373f17b22f0038a4c119e5a"} Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.144276 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.144343 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.219592 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.885840 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gchds" event={"ID":"fef199bb-8521-40db-9f75-221274c9299d","Type":"ContainerStarted","Data":"7c0ab0f883f2cd411f37d48a493d406e68db446f3db0054a41b4fe78e233effd"} Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.885895 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gchds" event={"ID":"fef199bb-8521-40db-9f75-221274c9299d","Type":"ContainerStarted","Data":"bfb10bbfd8c4c50ff239aba4ece9e369c4fb6ca625cb1024173a8410d91ffadc"} Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.886132 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gchds" Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.915431 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gchds" podStartSLOduration=3.915410755 podStartE2EDuration="3.915410755s" podCreationTimestamp="2025-10-10 07:05:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:05:08.910414751 +0000 UTC m=+835.980006002" watchObservedRunningTime="2025-10-10 07:05:08.915410755 +0000 UTC m=+835.985001996" Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.944651 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:05:08 crc kubenswrapper[4732]: I1010 07:05:08.999760 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t64z"] Oct 10 07:05:09 crc kubenswrapper[4732]: I1010 07:05:09.518228 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:05:09 crc kubenswrapper[4732]: I1010 07:05:09.518290 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:05:09 crc kubenswrapper[4732]: I1010 07:05:09.612473 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:05:09 crc kubenswrapper[4732]: I1010 07:05:09.949401 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:05:10 crc kubenswrapper[4732]: I1010 07:05:10.850736 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qq5dz"] Oct 10 07:05:10 crc kubenswrapper[4732]: I1010 07:05:10.895946 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8t64z" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="registry-server" containerID="cri-o://30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f" gracePeriod=2 Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.331758 
4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.445675 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-296q6\" (UniqueName: \"kubernetes.io/projected/22b90825-cb7e-4a96-9f89-77ebf442e1da-kube-api-access-296q6\") pod \"22b90825-cb7e-4a96-9f89-77ebf442e1da\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.445772 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-utilities\") pod \"22b90825-cb7e-4a96-9f89-77ebf442e1da\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.445804 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-catalog-content\") pod \"22b90825-cb7e-4a96-9f89-77ebf442e1da\" (UID: \"22b90825-cb7e-4a96-9f89-77ebf442e1da\") " Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.446733 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-utilities" (OuterVolumeSpecName: "utilities") pod "22b90825-cb7e-4a96-9f89-77ebf442e1da" (UID: "22b90825-cb7e-4a96-9f89-77ebf442e1da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.455336 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b90825-cb7e-4a96-9f89-77ebf442e1da-kube-api-access-296q6" (OuterVolumeSpecName: "kube-api-access-296q6") pod "22b90825-cb7e-4a96-9f89-77ebf442e1da" (UID: "22b90825-cb7e-4a96-9f89-77ebf442e1da"). 
InnerVolumeSpecName "kube-api-access-296q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.460078 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22b90825-cb7e-4a96-9f89-77ebf442e1da" (UID: "22b90825-cb7e-4a96-9f89-77ebf442e1da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.547183 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-296q6\" (UniqueName: \"kubernetes.io/projected/22b90825-cb7e-4a96-9f89-77ebf442e1da-kube-api-access-296q6\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.547219 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.547229 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22b90825-cb7e-4a96-9f89-77ebf442e1da-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.909500 4732 generic.go:334] "Generic (PLEG): container finished" podID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerID="30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f" exitCode=0 Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.909554 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8t64z" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.909672 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t64z" event={"ID":"22b90825-cb7e-4a96-9f89-77ebf442e1da","Type":"ContainerDied","Data":"30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f"} Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.909721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8t64z" event={"ID":"22b90825-cb7e-4a96-9f89-77ebf442e1da","Type":"ContainerDied","Data":"a6d75ecbcb87ace84b03ca36756517710b696839580a9c531b7b79b87715715b"} Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.909739 4732 scope.go:117] "RemoveContainer" containerID="30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f" Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.910264 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qq5dz" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="registry-server" containerID="cri-o://3504e0349f09b1ce16937813f2c9101f90f6712654ee4614da91e8fc1e6f5aa2" gracePeriod=2 Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.941752 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t64z"] Oct 10 07:05:11 crc kubenswrapper[4732]: I1010 07:05:11.945515 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8t64z"] Oct 10 07:05:12 crc kubenswrapper[4732]: I1010 07:05:12.916471 4732 generic.go:334] "Generic (PLEG): container finished" podID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerID="3504e0349f09b1ce16937813f2c9101f90f6712654ee4614da91e8fc1e6f5aa2" exitCode=0 Oct 10 07:05:12 crc kubenswrapper[4732]: I1010 07:05:12.916517 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qq5dz" event={"ID":"4197f98f-5abe-4dc3-a776-5a54faab28b5","Type":"ContainerDied","Data":"3504e0349f09b1ce16937813f2c9101f90f6712654ee4614da91e8fc1e6f5aa2"} Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.673854 4732 scope.go:117] "RemoveContainer" containerID="ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.677609 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" path="/var/lib/kubelet/pods/22b90825-cb7e-4a96-9f89-77ebf442e1da/volumes" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.745028 4732 scope.go:117] "RemoveContainer" containerID="3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.780094 4732 scope.go:117] "RemoveContainer" containerID="30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f" Oct 10 07:05:13 crc kubenswrapper[4732]: E1010 07:05:13.780602 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f\": container with ID starting with 30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f not found: ID does not exist" containerID="30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.780656 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f"} err="failed to get container status \"30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f\": rpc error: code = NotFound desc = could not find container \"30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f\": container with ID starting with 
30d869f0dcba20f46420bed7cebb3cde5750bce15181555722a09ec6506c465f not found: ID does not exist" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.780785 4732 scope.go:117] "RemoveContainer" containerID="ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4" Oct 10 07:05:13 crc kubenswrapper[4732]: E1010 07:05:13.781246 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4\": container with ID starting with ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4 not found: ID does not exist" containerID="ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.781283 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4"} err="failed to get container status \"ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4\": rpc error: code = NotFound desc = could not find container \"ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4\": container with ID starting with ab4f4a2ac5d64a28806aa361df023afa154ec5237650c461c7179398e1b8f8d4 not found: ID does not exist" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.781306 4732 scope.go:117] "RemoveContainer" containerID="3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb" Oct 10 07:05:13 crc kubenswrapper[4732]: E1010 07:05:13.781607 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb\": container with ID starting with 3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb not found: ID does not exist" containerID="3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb" Oct 10 07:05:13 crc 
kubenswrapper[4732]: I1010 07:05:13.781658 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb"} err="failed to get container status \"3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb\": rpc error: code = NotFound desc = could not find container \"3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb\": container with ID starting with 3ef766fd3b59022e4fd0a4acff7f6cdf9c18aa3e8349a2e74e2be3348a55a2fb not found: ID does not exist" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.924225 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.931911 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qq5dz" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.931912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qq5dz" event={"ID":"4197f98f-5abe-4dc3-a776-5a54faab28b5","Type":"ContainerDied","Data":"2bec00598d6586bacfb0c722d0e5c6d4f560c5cbac2589173379c46f8bb44083"} Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.932099 4732 scope.go:117] "RemoveContainer" containerID="3504e0349f09b1ce16937813f2c9101f90f6712654ee4614da91e8fc1e6f5aa2" Oct 10 07:05:13 crc kubenswrapper[4732]: I1010 07:05:13.967236 4732 scope.go:117] "RemoveContainer" containerID="b1ed1a31ef24d16d48beccae55e1e3e7634d1b8c77ec49455e5c86c97eabcd90" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.015547 4732 scope.go:117] "RemoveContainer" containerID="3198ac2c144776282c556210a27a037fd05784f7b22bcc484f50b07d4180ff77" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.075100 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-utilities\") pod \"4197f98f-5abe-4dc3-a776-5a54faab28b5\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.075250 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bvcj\" (UniqueName: \"kubernetes.io/projected/4197f98f-5abe-4dc3-a776-5a54faab28b5-kube-api-access-4bvcj\") pod \"4197f98f-5abe-4dc3-a776-5a54faab28b5\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.075271 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-catalog-content\") pod \"4197f98f-5abe-4dc3-a776-5a54faab28b5\" (UID: \"4197f98f-5abe-4dc3-a776-5a54faab28b5\") " Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.076741 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-utilities" (OuterVolumeSpecName: "utilities") pod "4197f98f-5abe-4dc3-a776-5a54faab28b5" (UID: "4197f98f-5abe-4dc3-a776-5a54faab28b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.082376 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4197f98f-5abe-4dc3-a776-5a54faab28b5-kube-api-access-4bvcj" (OuterVolumeSpecName: "kube-api-access-4bvcj") pod "4197f98f-5abe-4dc3-a776-5a54faab28b5" (UID: "4197f98f-5abe-4dc3-a776-5a54faab28b5"). InnerVolumeSpecName "kube-api-access-4bvcj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.119341 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4197f98f-5abe-4dc3-a776-5a54faab28b5" (UID: "4197f98f-5abe-4dc3-a776-5a54faab28b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.176555 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bvcj\" (UniqueName: \"kubernetes.io/projected/4197f98f-5abe-4dc3-a776-5a54faab28b5-kube-api-access-4bvcj\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.176607 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.176620 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4197f98f-5abe-4dc3-a776-5a54faab28b5-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.266802 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qq5dz"] Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.272303 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qq5dz"] Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.938903 4732 generic.go:334] "Generic (PLEG): container finished" podID="9748f00d-5c14-4cde-aea7-6d364ca08325" containerID="eef6e73a6bd7e751d92b6263d67a835237a965ad8f6a9ab82e6f869bcbdde8e9" exitCode=0 Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.938976 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerDied","Data":"eef6e73a6bd7e751d92b6263d67a835237a965ad8f6a9ab82e6f869bcbdde8e9"} Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.941264 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" event={"ID":"5cda66c6-9c63-41d3-b614-16bf38d53346","Type":"ContainerStarted","Data":"0b1caa57fb203a03a1a42554a486e794e451472bec5757e73bc38ce77c4e27e9"} Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.941400 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:14 crc kubenswrapper[4732]: I1010 07:05:14.986753 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" podStartSLOduration=2.242387774 podStartE2EDuration="9.986728987s" podCreationTimestamp="2025-10-10 07:05:05 +0000 UTC" firstStartedPulling="2025-10-10 07:05:06.048539094 +0000 UTC m=+833.118130345" lastFinishedPulling="2025-10-10 07:05:13.792880317 +0000 UTC m=+840.862471558" observedRunningTime="2025-10-10 07:05:14.982272177 +0000 UTC m=+842.051863418" watchObservedRunningTime="2025-10-10 07:05:14.986728987 +0000 UTC m=+842.056320228" Oct 10 07:05:15 crc kubenswrapper[4732]: I1010 07:05:15.669316 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" path="/var/lib/kubelet/pods/4197f98f-5abe-4dc3-a776-5a54faab28b5/volumes" Oct 10 07:05:15 crc kubenswrapper[4732]: I1010 07:05:15.947574 4732 generic.go:334] "Generic (PLEG): container finished" podID="9748f00d-5c14-4cde-aea7-6d364ca08325" containerID="726aa9bc5a0d00ad7f3be14b3d44573baab7ec3fb03eae5fe4d196d8639d5547" exitCode=0 Oct 10 07:05:15 crc kubenswrapper[4732]: I1010 07:05:15.947662 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerDied","Data":"726aa9bc5a0d00ad7f3be14b3d44573baab7ec3fb03eae5fe4d196d8639d5547"} Oct 10 07:05:16 crc kubenswrapper[4732]: I1010 07:05:16.954893 4732 generic.go:334] "Generic (PLEG): container finished" podID="9748f00d-5c14-4cde-aea7-6d364ca08325" containerID="d45715562058ec36692ac7bafc0af4079a7bc3b39406f94b9336d0da07bc8079" exitCode=0 Oct 10 07:05:16 crc kubenswrapper[4732]: I1010 07:05:16.955007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerDied","Data":"d45715562058ec36692ac7bafc0af4079a7bc3b39406f94b9336d0da07bc8079"} Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.558424 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gchds" Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.971600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerStarted","Data":"ba64f3075293c1210a2d0e407037a5dae1bb2a9009070e481e971de83ae2ec04"} Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.971644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerStarted","Data":"68211c151005308bdf9097e75dbafad5d9be8d875e3707fa4a84d7ca7d7892db"} Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.971654 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerStarted","Data":"f4442fa43f1e3953fffb225213604505e80234e3300b39e6ee0c154c01a7b4c3"} Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.971663 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" 
event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerStarted","Data":"b6df46d032ee4e7153f1bf13ed16bff1470ca9aee11c90ca205b1a0d8012766e"} Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.971673 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerStarted","Data":"23d7f446518218a5beb7b7e121db8f7af1212482cce0328b858271acd49e1845"} Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.971681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-62wtx" event={"ID":"9748f00d-5c14-4cde-aea7-6d364ca08325","Type":"ContainerStarted","Data":"1f5b7071c9ef9ecb4c8b6e124014cd95713ce4b5a491b574565361228fa34fd3"} Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.971837 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:17 crc kubenswrapper[4732]: I1010 07:05:17.994578 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-62wtx" podStartSLOduration=4.958591181 podStartE2EDuration="12.994367202s" podCreationTimestamp="2025-10-10 07:05:05 +0000 UTC" firstStartedPulling="2025-10-10 07:05:05.760933279 +0000 UTC m=+832.830524520" lastFinishedPulling="2025-10-10 07:05:13.7967093 +0000 UTC m=+840.866300541" observedRunningTime="2025-10-10 07:05:17.991012012 +0000 UTC m=+845.060603263" watchObservedRunningTime="2025-10-10 07:05:17.994367202 +0000 UTC m=+845.063958443" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.099096 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq"] Oct 10 07:05:19 crc kubenswrapper[4732]: E1010 07:05:19.099728 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="registry-server" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 
07:05:19.099745 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="registry-server" Oct 10 07:05:19 crc kubenswrapper[4732]: E1010 07:05:19.099773 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="extract-content" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.099782 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="extract-content" Oct 10 07:05:19 crc kubenswrapper[4732]: E1010 07:05:19.099796 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="extract-utilities" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.099804 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="extract-utilities" Oct 10 07:05:19 crc kubenswrapper[4732]: E1010 07:05:19.099817 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="extract-content" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.099825 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="extract-content" Oct 10 07:05:19 crc kubenswrapper[4732]: E1010 07:05:19.099838 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="extract-utilities" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.099846 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="extract-utilities" Oct 10 07:05:19 crc kubenswrapper[4732]: E1010 07:05:19.099858 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="registry-server" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 
07:05:19.099866 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="registry-server" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.100007 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b90825-cb7e-4a96-9f89-77ebf442e1da" containerName="registry-server" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.100017 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4197f98f-5abe-4dc3-a776-5a54faab28b5" containerName="registry-server" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.100764 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.103019 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.111883 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq"] Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.241624 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp49t\" (UniqueName: \"kubernetes.io/projected/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-kube-api-access-qp49t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.241888 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-util\") pod 
\"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.241958 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.343192 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.343249 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.343308 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp49t\" (UniqueName: \"kubernetes.io/projected/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-kube-api-access-qp49t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " 
pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.343809 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.344177 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.363922 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp49t\" (UniqueName: \"kubernetes.io/projected/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-kube-api-access-qp49t\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.423641 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.841359 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq"] Oct 10 07:05:19 crc kubenswrapper[4732]: W1010 07:05:19.846593 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643a9f80_158c_4f4c_a1a1_5feac2bac0c1.slice/crio-acfe24753ce7600abf71bf1a258c51218885534f0afc2dc10c5a2ef91e55a327 WatchSource:0}: Error finding container acfe24753ce7600abf71bf1a258c51218885534f0afc2dc10c5a2ef91e55a327: Status 404 returned error can't find the container with id acfe24753ce7600abf71bf1a258c51218885534f0afc2dc10c5a2ef91e55a327 Oct 10 07:05:19 crc kubenswrapper[4732]: I1010 07:05:19.983034 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" event={"ID":"643a9f80-158c-4f4c-a1a1-5feac2bac0c1","Type":"ContainerStarted","Data":"acfe24753ce7600abf71bf1a258c51218885534f0afc2dc10c5a2ef91e55a327"} Oct 10 07:05:20 crc kubenswrapper[4732]: I1010 07:05:20.618846 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:20 crc kubenswrapper[4732]: I1010 07:05:20.692375 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:20 crc kubenswrapper[4732]: I1010 07:05:20.990149 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" event={"ID":"643a9f80-158c-4f4c-a1a1-5feac2bac0c1","Type":"ContainerStarted","Data":"0e30805de5e97051019b651a909aab38322166591bb5eb7ff731b2baa3296963"} Oct 10 07:05:21 crc kubenswrapper[4732]: I1010 07:05:21.999364 
4732 generic.go:334] "Generic (PLEG): container finished" podID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerID="0e30805de5e97051019b651a909aab38322166591bb5eb7ff731b2baa3296963" exitCode=0 Oct 10 07:05:22 crc kubenswrapper[4732]: I1010 07:05:21.999417 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" event={"ID":"643a9f80-158c-4f4c-a1a1-5feac2bac0c1","Type":"ContainerDied","Data":"0e30805de5e97051019b651a909aab38322166591bb5eb7ff731b2baa3296963"} Oct 10 07:05:24 crc kubenswrapper[4732]: I1010 07:05:24.015481 4732 generic.go:334] "Generic (PLEG): container finished" podID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerID="8ca814198fd199f44a8fac41d9da6a6fcd6eddaf52e5a5fefdbc2cd0d3b05072" exitCode=0 Oct 10 07:05:24 crc kubenswrapper[4732]: I1010 07:05:24.015717 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" event={"ID":"643a9f80-158c-4f4c-a1a1-5feac2bac0c1","Type":"ContainerDied","Data":"8ca814198fd199f44a8fac41d9da6a6fcd6eddaf52e5a5fefdbc2cd0d3b05072"} Oct 10 07:05:25 crc kubenswrapper[4732]: I1010 07:05:25.024772 4732 generic.go:334] "Generic (PLEG): container finished" podID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerID="ba05e72ce5db0b678e015c980f2dbe7ab213e4a5392733dc2246ce52c0e4622d" exitCode=0 Oct 10 07:05:25 crc kubenswrapper[4732]: I1010 07:05:25.024835 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" event={"ID":"643a9f80-158c-4f4c-a1a1-5feac2bac0c1","Type":"ContainerDied","Data":"ba05e72ce5db0b678e015c980f2dbe7ab213e4a5392733dc2246ce52c0e4622d"} Oct 10 07:05:25 crc kubenswrapper[4732]: I1010 07:05:25.615777 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-tnccg" Oct 10 07:05:25 crc 
kubenswrapper[4732]: I1010 07:05:25.779500 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-xnbhc" Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.304032 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.436608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-util\") pod \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.436663 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp49t\" (UniqueName: \"kubernetes.io/projected/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-kube-api-access-qp49t\") pod \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.436713 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-bundle\") pod \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\" (UID: \"643a9f80-158c-4f4c-a1a1-5feac2bac0c1\") " Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.437799 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-bundle" (OuterVolumeSpecName: "bundle") pod "643a9f80-158c-4f4c-a1a1-5feac2bac0c1" (UID: "643a9f80-158c-4f4c-a1a1-5feac2bac0c1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.442833 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-kube-api-access-qp49t" (OuterVolumeSpecName: "kube-api-access-qp49t") pod "643a9f80-158c-4f4c-a1a1-5feac2bac0c1" (UID: "643a9f80-158c-4f4c-a1a1-5feac2bac0c1"). InnerVolumeSpecName "kube-api-access-qp49t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.538873 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp49t\" (UniqueName: \"kubernetes.io/projected/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-kube-api-access-qp49t\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.538908 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.796174 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-util" (OuterVolumeSpecName: "util") pod "643a9f80-158c-4f4c-a1a1-5feac2bac0c1" (UID: "643a9f80-158c-4f4c-a1a1-5feac2bac0c1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:05:26 crc kubenswrapper[4732]: I1010 07:05:26.842503 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/643a9f80-158c-4f4c-a1a1-5feac2bac0c1-util\") on node \"crc\" DevicePath \"\"" Oct 10 07:05:27 crc kubenswrapper[4732]: I1010 07:05:27.037506 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" event={"ID":"643a9f80-158c-4f4c-a1a1-5feac2bac0c1","Type":"ContainerDied","Data":"acfe24753ce7600abf71bf1a258c51218885534f0afc2dc10c5a2ef91e55a327"} Oct 10 07:05:27 crc kubenswrapper[4732]: I1010 07:05:27.037544 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acfe24753ce7600abf71bf1a258c51218885534f0afc2dc10c5a2ef91e55a327" Oct 10 07:05:27 crc kubenswrapper[4732]: I1010 07:05:27.037585 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.503432 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5"] Oct 10 07:05:32 crc kubenswrapper[4732]: E1010 07:05:32.504415 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerName="pull" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.504431 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerName="pull" Oct 10 07:05:32 crc kubenswrapper[4732]: E1010 07:05:32.504447 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerName="extract" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.504454 4732 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerName="extract" Oct 10 07:05:32 crc kubenswrapper[4732]: E1010 07:05:32.504476 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerName="util" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.504484 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerName="util" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.504612 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="643a9f80-158c-4f4c-a1a1-5feac2bac0c1" containerName="extract" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.505094 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.507229 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.507768 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.508052 4732 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-584wj" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.522231 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5"] Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.536059 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-849d5\" (UniqueName: \"kubernetes.io/projected/691700ff-4c15-46b1-a70f-33b41ddf4fc4-kube-api-access-849d5\") pod 
\"cert-manager-operator-controller-manager-57cd46d6d-46sm5\" (UID: \"691700ff-4c15-46b1-a70f-33b41ddf4fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.637747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-849d5\" (UniqueName: \"kubernetes.io/projected/691700ff-4c15-46b1-a70f-33b41ddf4fc4-kube-api-access-849d5\") pod \"cert-manager-operator-controller-manager-57cd46d6d-46sm5\" (UID: \"691700ff-4c15-46b1-a70f-33b41ddf4fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.666757 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-849d5\" (UniqueName: \"kubernetes.io/projected/691700ff-4c15-46b1-a70f-33b41ddf4fc4-kube-api-access-849d5\") pod \"cert-manager-operator-controller-manager-57cd46d6d-46sm5\" (UID: \"691700ff-4c15-46b1-a70f-33b41ddf4fc4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" Oct 10 07:05:32 crc kubenswrapper[4732]: I1010 07:05:32.820558 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" Oct 10 07:05:33 crc kubenswrapper[4732]: I1010 07:05:33.288699 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5"] Oct 10 07:05:33 crc kubenswrapper[4732]: W1010 07:05:33.297912 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod691700ff_4c15_46b1_a70f_33b41ddf4fc4.slice/crio-5c7913f8c1abaa023c27b114e86b4b1f620a69f3876109b9518fb987228c8715 WatchSource:0}: Error finding container 5c7913f8c1abaa023c27b114e86b4b1f620a69f3876109b9518fb987228c8715: Status 404 returned error can't find the container with id 5c7913f8c1abaa023c27b114e86b4b1f620a69f3876109b9518fb987228c8715 Oct 10 07:05:34 crc kubenswrapper[4732]: I1010 07:05:34.076071 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" event={"ID":"691700ff-4c15-46b1-a70f-33b41ddf4fc4","Type":"ContainerStarted","Data":"5c7913f8c1abaa023c27b114e86b4b1f620a69f3876109b9518fb987228c8715"} Oct 10 07:05:35 crc kubenswrapper[4732]: I1010 07:05:35.620426 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-62wtx" Oct 10 07:05:41 crc kubenswrapper[4732]: I1010 07:05:41.130861 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" event={"ID":"691700ff-4c15-46b1-a70f-33b41ddf4fc4","Type":"ContainerStarted","Data":"85819c4b6ef6c845953db4d4ed210a2960bb678490798e2024a802c594e4e489"} Oct 10 07:05:41 crc kubenswrapper[4732]: I1010 07:05:41.152009 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-46sm5" podStartSLOduration=2.280183305 
podStartE2EDuration="9.151991408s" podCreationTimestamp="2025-10-10 07:05:32 +0000 UTC" firstStartedPulling="2025-10-10 07:05:33.300338435 +0000 UTC m=+860.369929696" lastFinishedPulling="2025-10-10 07:05:40.172146558 +0000 UTC m=+867.241737799" observedRunningTime="2025-10-10 07:05:41.147906088 +0000 UTC m=+868.217497329" watchObservedRunningTime="2025-10-10 07:05:41.151991408 +0000 UTC m=+868.221582659" Oct 10 07:05:43 crc kubenswrapper[4732]: I1010 07:05:43.985400 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-c7c5k"] Oct 10 07:05:43 crc kubenswrapper[4732]: I1010 07:05:43.986717 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:43 crc kubenswrapper[4732]: I1010 07:05:43.989313 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 10 07:05:43 crc kubenswrapper[4732]: I1010 07:05:43.990098 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 10 07:05:43 crc kubenswrapper[4732]: I1010 07:05:43.990277 4732 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-z7n99" Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.007139 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-c7c5k"] Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.110425 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-964bn\" (UniqueName: \"kubernetes.io/projected/3db91b6b-e402-459a-b068-700c99ca4552-kube-api-access-964bn\") pod \"cert-manager-webhook-d969966f-c7c5k\" (UID: \"3db91b6b-e402-459a-b068-700c99ca4552\") " pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.110758 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db91b6b-e402-459a-b068-700c99ca4552-bound-sa-token\") pod \"cert-manager-webhook-d969966f-c7c5k\" (UID: \"3db91b6b-e402-459a-b068-700c99ca4552\") " pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.212062 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db91b6b-e402-459a-b068-700c99ca4552-bound-sa-token\") pod \"cert-manager-webhook-d969966f-c7c5k\" (UID: \"3db91b6b-e402-459a-b068-700c99ca4552\") " pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.212134 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-964bn\" (UniqueName: \"kubernetes.io/projected/3db91b6b-e402-459a-b068-700c99ca4552-kube-api-access-964bn\") pod \"cert-manager-webhook-d969966f-c7c5k\" (UID: \"3db91b6b-e402-459a-b068-700c99ca4552\") " pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.230001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3db91b6b-e402-459a-b068-700c99ca4552-bound-sa-token\") pod \"cert-manager-webhook-d969966f-c7c5k\" (UID: \"3db91b6b-e402-459a-b068-700c99ca4552\") " pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.232503 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-964bn\" (UniqueName: \"kubernetes.io/projected/3db91b6b-e402-459a-b068-700c99ca4552-kube-api-access-964bn\") pod \"cert-manager-webhook-d969966f-c7c5k\" (UID: \"3db91b6b-e402-459a-b068-700c99ca4552\") " pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 
07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.302760 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:44 crc kubenswrapper[4732]: I1010 07:05:44.779911 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-c7c5k"] Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.150268 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" event={"ID":"3db91b6b-e402-459a-b068-700c99ca4552","Type":"ContainerStarted","Data":"6a71bdbea33301c5646877b3fbac7ebdf2a7fe1ac4739bdfd060e558cc0d5f89"} Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.402416 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc"] Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.403569 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.406921 4732 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-p4rx5" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.412219 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc"] Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.528091 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpp9k\" (UniqueName: \"kubernetes.io/projected/825885af-0634-426b-92b4-e877ae53c058-kube-api-access-bpp9k\") pod \"cert-manager-cainjector-7d9f95dbf-b6fhc\" (UID: \"825885af-0634-426b-92b4-e877ae53c058\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.528150 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/825885af-0634-426b-92b4-e877ae53c058-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-b6fhc\" (UID: \"825885af-0634-426b-92b4-e877ae53c058\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.628938 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/825885af-0634-426b-92b4-e877ae53c058-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-b6fhc\" (UID: \"825885af-0634-426b-92b4-e877ae53c058\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.629074 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpp9k\" (UniqueName: \"kubernetes.io/projected/825885af-0634-426b-92b4-e877ae53c058-kube-api-access-bpp9k\") pod \"cert-manager-cainjector-7d9f95dbf-b6fhc\" (UID: \"825885af-0634-426b-92b4-e877ae53c058\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.654785 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/825885af-0634-426b-92b4-e877ae53c058-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-b6fhc\" (UID: \"825885af-0634-426b-92b4-e877ae53c058\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.670183 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpp9k\" (UniqueName: \"kubernetes.io/projected/825885af-0634-426b-92b4-e877ae53c058-kube-api-access-bpp9k\") pod \"cert-manager-cainjector-7d9f95dbf-b6fhc\" (UID: \"825885af-0634-426b-92b4-e877ae53c058\") " 
pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:45 crc kubenswrapper[4732]: I1010 07:05:45.726871 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" Oct 10 07:05:46 crc kubenswrapper[4732]: I1010 07:05:46.152454 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc"] Oct 10 07:05:47 crc kubenswrapper[4732]: I1010 07:05:47.165443 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" event={"ID":"825885af-0634-426b-92b4-e877ae53c058","Type":"ContainerStarted","Data":"63e19a803b314952ac005b9483a81127cbaa4c935cadfb20b21faf468eee7386"} Oct 10 07:05:49 crc kubenswrapper[4732]: I1010 07:05:49.190893 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" event={"ID":"3db91b6b-e402-459a-b068-700c99ca4552","Type":"ContainerStarted","Data":"f864db22c5eceba6ba965374627104e5f7c6db6cbc9f3d6cf859a0a14165774f"} Oct 10 07:05:49 crc kubenswrapper[4732]: I1010 07:05:49.191310 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:49 crc kubenswrapper[4732]: I1010 07:05:49.193101 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" event={"ID":"825885af-0634-426b-92b4-e877ae53c058","Type":"ContainerStarted","Data":"fd4d764fdd3aa174995eb958e67f9ecc4570569d5fdd019b51e3b0ca146f2f4c"} Oct 10 07:05:49 crc kubenswrapper[4732]: I1010 07:05:49.207580 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" podStartSLOduration=2.170831607 podStartE2EDuration="6.207564729s" podCreationTimestamp="2025-10-10 07:05:43 +0000 UTC" firstStartedPulling="2025-10-10 07:05:44.796227716 +0000 UTC m=+871.865818957" 
lastFinishedPulling="2025-10-10 07:05:48.832960828 +0000 UTC m=+875.902552079" observedRunningTime="2025-10-10 07:05:49.205302139 +0000 UTC m=+876.274893400" watchObservedRunningTime="2025-10-10 07:05:49.207564729 +0000 UTC m=+876.277155970" Oct 10 07:05:49 crc kubenswrapper[4732]: I1010 07:05:49.222906 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-b6fhc" podStartSLOduration=1.561452535 podStartE2EDuration="4.222882102s" podCreationTimestamp="2025-10-10 07:05:45 +0000 UTC" firstStartedPulling="2025-10-10 07:05:46.172451725 +0000 UTC m=+873.242042966" lastFinishedPulling="2025-10-10 07:05:48.833881292 +0000 UTC m=+875.903472533" observedRunningTime="2025-10-10 07:05:49.217850106 +0000 UTC m=+876.287441347" watchObservedRunningTime="2025-10-10 07:05:49.222882102 +0000 UTC m=+876.292473363" Oct 10 07:05:54 crc kubenswrapper[4732]: I1010 07:05:54.306070 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-c7c5k" Oct 10 07:05:55 crc kubenswrapper[4732]: I1010 07:05:55.356586 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:05:55 crc kubenswrapper[4732]: I1010 07:05:55.357956 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.625907 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-7rj9k"] Oct 10 07:06:02 
crc kubenswrapper[4732]: I1010 07:06:02.627675 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.629726 4732 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-z6mvd" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.674203 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-7rj9k"] Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.763115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hl6t\" (UniqueName: \"kubernetes.io/projected/dd7307fa-d381-41e0-b69f-aa09aeacad83-kube-api-access-6hl6t\") pod \"cert-manager-7d4cc89fcb-7rj9k\" (UID: \"dd7307fa-d381-41e0-b69f-aa09aeacad83\") " pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.763160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd7307fa-d381-41e0-b69f-aa09aeacad83-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-7rj9k\" (UID: \"dd7307fa-d381-41e0-b69f-aa09aeacad83\") " pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.864317 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hl6t\" (UniqueName: \"kubernetes.io/projected/dd7307fa-d381-41e0-b69f-aa09aeacad83-kube-api-access-6hl6t\") pod \"cert-manager-7d4cc89fcb-7rj9k\" (UID: \"dd7307fa-d381-41e0-b69f-aa09aeacad83\") " pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.864367 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/dd7307fa-d381-41e0-b69f-aa09aeacad83-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-7rj9k\" (UID: \"dd7307fa-d381-41e0-b69f-aa09aeacad83\") " pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.883024 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hl6t\" (UniqueName: \"kubernetes.io/projected/dd7307fa-d381-41e0-b69f-aa09aeacad83-kube-api-access-6hl6t\") pod \"cert-manager-7d4cc89fcb-7rj9k\" (UID: \"dd7307fa-d381-41e0-b69f-aa09aeacad83\") " pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.884897 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd7307fa-d381-41e0-b69f-aa09aeacad83-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-7rj9k\" (UID: \"dd7307fa-d381-41e0-b69f-aa09aeacad83\") " pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:02 crc kubenswrapper[4732]: I1010 07:06:02.951010 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" Oct 10 07:06:03 crc kubenswrapper[4732]: I1010 07:06:03.394641 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-7rj9k"] Oct 10 07:06:04 crc kubenswrapper[4732]: I1010 07:06:04.293949 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" event={"ID":"dd7307fa-d381-41e0-b69f-aa09aeacad83","Type":"ContainerStarted","Data":"4327ed9c63ed8bb160eae78e3138f4dbe0b5b15aff442202d2f844f2a1d72dcc"} Oct 10 07:06:04 crc kubenswrapper[4732]: I1010 07:06:04.294875 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" event={"ID":"dd7307fa-d381-41e0-b69f-aa09aeacad83","Type":"ContainerStarted","Data":"cc88e8096f789c116623169e5303bed8ce839c2f5ef96d44129f4069e4d45c61"} Oct 10 07:06:04 crc kubenswrapper[4732]: I1010 07:06:04.312831 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-7rj9k" podStartSLOduration=2.312806551 podStartE2EDuration="2.312806551s" podCreationTimestamp="2025-10-10 07:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:06:04.307627441 +0000 UTC m=+891.377218712" watchObservedRunningTime="2025-10-10 07:06:04.312806551 +0000 UTC m=+891.382397822" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.511991 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hmk6f"] Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.513286 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hmk6f" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.515468 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.515594 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-qxtbp" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.517171 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.532039 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hmk6f"] Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.623270 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ggjl\" (UniqueName: \"kubernetes.io/projected/85f4da21-ae47-448a-b699-73c38c6ef7c8-kube-api-access-4ggjl\") pod \"openstack-operator-index-hmk6f\" (UID: \"85f4da21-ae47-448a-b699-73c38c6ef7c8\") " pod="openstack-operators/openstack-operator-index-hmk6f" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.724750 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ggjl\" (UniqueName: \"kubernetes.io/projected/85f4da21-ae47-448a-b699-73c38c6ef7c8-kube-api-access-4ggjl\") pod \"openstack-operator-index-hmk6f\" (UID: \"85f4da21-ae47-448a-b699-73c38c6ef7c8\") " pod="openstack-operators/openstack-operator-index-hmk6f" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.745587 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ggjl\" (UniqueName: \"kubernetes.io/projected/85f4da21-ae47-448a-b699-73c38c6ef7c8-kube-api-access-4ggjl\") pod \"openstack-operator-index-hmk6f\" (UID: 
\"85f4da21-ae47-448a-b699-73c38c6ef7c8\") " pod="openstack-operators/openstack-operator-index-hmk6f" Oct 10 07:06:07 crc kubenswrapper[4732]: I1010 07:06:07.830877 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hmk6f" Oct 10 07:06:08 crc kubenswrapper[4732]: I1010 07:06:08.217024 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hmk6f"] Oct 10 07:06:08 crc kubenswrapper[4732]: I1010 07:06:08.316232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hmk6f" event={"ID":"85f4da21-ae47-448a-b699-73c38c6ef7c8","Type":"ContainerStarted","Data":"d38696ce2e4658ea200721199ccb38f7717e5918f45b2a374d66a3746fe7b53d"} Oct 10 07:06:10 crc kubenswrapper[4732]: I1010 07:06:10.330610 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hmk6f" event={"ID":"85f4da21-ae47-448a-b699-73c38c6ef7c8","Type":"ContainerStarted","Data":"16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5"} Oct 10 07:06:10 crc kubenswrapper[4732]: I1010 07:06:10.350429 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hmk6f" podStartSLOduration=1.993342999 podStartE2EDuration="3.350405302s" podCreationTimestamp="2025-10-10 07:06:07 +0000 UTC" firstStartedPulling="2025-10-10 07:06:08.22906027 +0000 UTC m=+895.298651511" lastFinishedPulling="2025-10-10 07:06:09.586122573 +0000 UTC m=+896.655713814" observedRunningTime="2025-10-10 07:06:10.346820925 +0000 UTC m=+897.416412186" watchObservedRunningTime="2025-10-10 07:06:10.350405302 +0000 UTC m=+897.419996553" Oct 10 07:06:11 crc kubenswrapper[4732]: I1010 07:06:11.089531 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hmk6f"] Oct 10 07:06:11 crc kubenswrapper[4732]: I1010 07:06:11.687627 
4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-p8tlw"] Oct 10 07:06:11 crc kubenswrapper[4732]: I1010 07:06:11.688479 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:11 crc kubenswrapper[4732]: I1010 07:06:11.699209 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p8tlw"] Oct 10 07:06:11 crc kubenswrapper[4732]: I1010 07:06:11.879625 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm86x\" (UniqueName: \"kubernetes.io/projected/5ed64387-dd36-41ee-82e9-779579474c87-kube-api-access-cm86x\") pod \"openstack-operator-index-p8tlw\" (UID: \"5ed64387-dd36-41ee-82e9-779579474c87\") " pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:11 crc kubenswrapper[4732]: I1010 07:06:11.981235 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm86x\" (UniqueName: \"kubernetes.io/projected/5ed64387-dd36-41ee-82e9-779579474c87-kube-api-access-cm86x\") pod \"openstack-operator-index-p8tlw\" (UID: \"5ed64387-dd36-41ee-82e9-779579474c87\") " pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.000298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm86x\" (UniqueName: \"kubernetes.io/projected/5ed64387-dd36-41ee-82e9-779579474c87-kube-api-access-cm86x\") pod \"openstack-operator-index-p8tlw\" (UID: \"5ed64387-dd36-41ee-82e9-779579474c87\") " pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.005884 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.343901 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hmk6f" podUID="85f4da21-ae47-448a-b699-73c38c6ef7c8" containerName="registry-server" containerID="cri-o://16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5" gracePeriod=2 Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.385325 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-p8tlw"] Oct 10 07:06:12 crc kubenswrapper[4732]: W1010 07:06:12.437256 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ed64387_dd36_41ee_82e9_779579474c87.slice/crio-252bf619ec7e37cee92bd2bcb71a6295d777362234fc71c9c20c811baa16b4da WatchSource:0}: Error finding container 252bf619ec7e37cee92bd2bcb71a6295d777362234fc71c9c20c811baa16b4da: Status 404 returned error can't find the container with id 252bf619ec7e37cee92bd2bcb71a6295d777362234fc71c9c20c811baa16b4da Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.695648 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hmk6f" Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.895795 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ggjl\" (UniqueName: \"kubernetes.io/projected/85f4da21-ae47-448a-b699-73c38c6ef7c8-kube-api-access-4ggjl\") pod \"85f4da21-ae47-448a-b699-73c38c6ef7c8\" (UID: \"85f4da21-ae47-448a-b699-73c38c6ef7c8\") " Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.902523 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f4da21-ae47-448a-b699-73c38c6ef7c8-kube-api-access-4ggjl" (OuterVolumeSpecName: "kube-api-access-4ggjl") pod "85f4da21-ae47-448a-b699-73c38c6ef7c8" (UID: "85f4da21-ae47-448a-b699-73c38c6ef7c8"). InnerVolumeSpecName "kube-api-access-4ggjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:06:12 crc kubenswrapper[4732]: I1010 07:06:12.997387 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ggjl\" (UniqueName: \"kubernetes.io/projected/85f4da21-ae47-448a-b699-73c38c6ef7c8-kube-api-access-4ggjl\") on node \"crc\" DevicePath \"\"" Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.352392 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p8tlw" event={"ID":"5ed64387-dd36-41ee-82e9-779579474c87","Type":"ContainerStarted","Data":"10f20e917da26a765321830cfb734a01bc9c31ada4f11bf3408772c11ddc402c"} Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.352798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-p8tlw" event={"ID":"5ed64387-dd36-41ee-82e9-779579474c87","Type":"ContainerStarted","Data":"252bf619ec7e37cee92bd2bcb71a6295d777362234fc71c9c20c811baa16b4da"} Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.354212 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="85f4da21-ae47-448a-b699-73c38c6ef7c8" containerID="16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5" exitCode=0 Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.354279 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hmk6f" event={"ID":"85f4da21-ae47-448a-b699-73c38c6ef7c8","Type":"ContainerDied","Data":"16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5"} Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.354332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hmk6f" event={"ID":"85f4da21-ae47-448a-b699-73c38c6ef7c8","Type":"ContainerDied","Data":"d38696ce2e4658ea200721199ccb38f7717e5918f45b2a374d66a3746fe7b53d"} Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.354355 4732 scope.go:117] "RemoveContainer" containerID="16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5" Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.354478 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hmk6f" Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.371010 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-p8tlw" podStartSLOduration=1.763143117 podStartE2EDuration="2.370986235s" podCreationTimestamp="2025-10-10 07:06:11 +0000 UTC" firstStartedPulling="2025-10-10 07:06:12.44149662 +0000 UTC m=+899.511087861" lastFinishedPulling="2025-10-10 07:06:13.049339738 +0000 UTC m=+900.118930979" observedRunningTime="2025-10-10 07:06:13.370374909 +0000 UTC m=+900.439966160" watchObservedRunningTime="2025-10-10 07:06:13.370986235 +0000 UTC m=+900.440577476" Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.374571 4732 scope.go:117] "RemoveContainer" containerID="16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5" Oct 10 07:06:13 crc kubenswrapper[4732]: E1010 07:06:13.375057 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5\": container with ID starting with 16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5 not found: ID does not exist" containerID="16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5" Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.375114 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5"} err="failed to get container status \"16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5\": rpc error: code = NotFound desc = could not find container \"16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5\": container with ID starting with 16604933388d1ff6000f9be6c1d7d4151f4b7ca004610368a4c681a4386c57f5 not found: ID does not exist" Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 
07:06:13.386092 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hmk6f"] Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.389735 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hmk6f"] Oct 10 07:06:13 crc kubenswrapper[4732]: I1010 07:06:13.682386 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f4da21-ae47-448a-b699-73c38c6ef7c8" path="/var/lib/kubelet/pods/85f4da21-ae47-448a-b699-73c38c6ef7c8/volumes" Oct 10 07:06:22 crc kubenswrapper[4732]: I1010 07:06:22.006555 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:22 crc kubenswrapper[4732]: I1010 07:06:22.008007 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:22 crc kubenswrapper[4732]: I1010 07:06:22.035269 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:22 crc kubenswrapper[4732]: I1010 07:06:22.443833 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-p8tlw" Oct 10 07:06:25 crc kubenswrapper[4732]: I1010 07:06:25.356050 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:06:25 crc kubenswrapper[4732]: I1010 07:06:25.356515 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.599633 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg"] Oct 10 07:06:27 crc kubenswrapper[4732]: E1010 07:06:27.599946 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f4da21-ae47-448a-b699-73c38c6ef7c8" containerName="registry-server" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.599963 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f4da21-ae47-448a-b699-73c38c6ef7c8" containerName="registry-server" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.600082 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f4da21-ae47-448a-b699-73c38c6ef7c8" containerName="registry-server" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.600970 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.603592 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bn7x8" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.603601 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-util\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.603650 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cslb2\" (UniqueName: \"kubernetes.io/projected/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-kube-api-access-cslb2\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.603726 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-bundle\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.609188 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg"] Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 
07:06:27.704814 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-util\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.704869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cslb2\" (UniqueName: \"kubernetes.io/projected/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-kube-api-access-cslb2\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.704914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-bundle\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.705365 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-bundle\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.705596 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-util\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.725109 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cslb2\" (UniqueName: \"kubernetes.io/projected/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-kube-api-access-cslb2\") pod \"f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:27 crc kubenswrapper[4732]: I1010 07:06:27.917653 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:28 crc kubenswrapper[4732]: I1010 07:06:28.382868 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg"] Oct 10 07:06:28 crc kubenswrapper[4732]: W1010 07:06:28.390181 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5a9a96_f8cf_4fd0_a8e2_d29db90bfcdc.slice/crio-51957aee8a43e1915fc5e2f7c780111f1618d4a622fa7c1c007da8948e74b7fd WatchSource:0}: Error finding container 51957aee8a43e1915fc5e2f7c780111f1618d4a622fa7c1c007da8948e74b7fd: Status 404 returned error can't find the container with id 51957aee8a43e1915fc5e2f7c780111f1618d4a622fa7c1c007da8948e74b7fd Oct 10 07:06:28 crc kubenswrapper[4732]: I1010 07:06:28.452436 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" 
event={"ID":"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc","Type":"ContainerStarted","Data":"51957aee8a43e1915fc5e2f7c780111f1618d4a622fa7c1c007da8948e74b7fd"} Oct 10 07:06:29 crc kubenswrapper[4732]: I1010 07:06:29.462728 4732 generic.go:334] "Generic (PLEG): container finished" podID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerID="5c62551bd87fb9d55acbcceb1f19a133704c286e87e419af278972c4e121b6dd" exitCode=0 Oct 10 07:06:29 crc kubenswrapper[4732]: I1010 07:06:29.462768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" event={"ID":"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc","Type":"ContainerDied","Data":"5c62551bd87fb9d55acbcceb1f19a133704c286e87e419af278972c4e121b6dd"} Oct 10 07:06:30 crc kubenswrapper[4732]: I1010 07:06:30.469579 4732 generic.go:334] "Generic (PLEG): container finished" podID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerID="a61144759556678939c1e76c788877a43bf4353393f5b6348a881205965de276" exitCode=0 Oct 10 07:06:30 crc kubenswrapper[4732]: I1010 07:06:30.469914 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" event={"ID":"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc","Type":"ContainerDied","Data":"a61144759556678939c1e76c788877a43bf4353393f5b6348a881205965de276"} Oct 10 07:06:31 crc kubenswrapper[4732]: I1010 07:06:31.477453 4732 generic.go:334] "Generic (PLEG): container finished" podID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerID="2973147f6bdbe7552e1654137ca73238f4a3c964129f5f1365a17ba2040c89d8" exitCode=0 Oct 10 07:06:31 crc kubenswrapper[4732]: I1010 07:06:31.477550 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" event={"ID":"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc","Type":"ContainerDied","Data":"2973147f6bdbe7552e1654137ca73238f4a3c964129f5f1365a17ba2040c89d8"} Oct 10 
07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.765350 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.874614 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-bundle\") pod \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.874965 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cslb2\" (UniqueName: \"kubernetes.io/projected/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-kube-api-access-cslb2\") pod \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.874994 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-util\") pod \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\" (UID: \"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc\") " Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.875677 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-bundle" (OuterVolumeSpecName: "bundle") pod "bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" (UID: "bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.881377 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-kube-api-access-cslb2" (OuterVolumeSpecName: "kube-api-access-cslb2") pod "bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" (UID: "bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc"). InnerVolumeSpecName "kube-api-access-cslb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.890557 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-util" (OuterVolumeSpecName: "util") pod "bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" (UID: "bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.975992 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.976028 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cslb2\" (UniqueName: \"kubernetes.io/projected/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-kube-api-access-cslb2\") on node \"crc\" DevicePath \"\"" Oct 10 07:06:32 crc kubenswrapper[4732]: I1010 07:06:32.976042 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc-util\") on node \"crc\" DevicePath \"\"" Oct 10 07:06:33 crc kubenswrapper[4732]: I1010 07:06:33.499781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" 
event={"ID":"bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc","Type":"ContainerDied","Data":"51957aee8a43e1915fc5e2f7c780111f1618d4a622fa7c1c007da8948e74b7fd"} Oct 10 07:06:33 crc kubenswrapper[4732]: I1010 07:06:33.499827 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51957aee8a43e1915fc5e2f7c780111f1618d4a622fa7c1c007da8948e74b7fd" Oct 10 07:06:33 crc kubenswrapper[4732]: I1010 07:06:33.499833 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.262628 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5"] Oct 10 07:06:40 crc kubenswrapper[4732]: E1010 07:06:40.263209 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerName="pull" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.263221 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerName="pull" Oct 10 07:06:40 crc kubenswrapper[4732]: E1010 07:06:40.263235 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerName="util" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.263241 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerName="util" Oct 10 07:06:40 crc kubenswrapper[4732]: E1010 07:06:40.263253 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerName="extract" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.263259 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerName="extract" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.263361 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc" containerName="extract" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.263953 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.267637 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-ppnpb" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.290050 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5"] Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.371487 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whvrv\" (UniqueName: \"kubernetes.io/projected/7fd96c0c-f327-4080-9429-28e5a10b932a-kube-api-access-whvrv\") pod \"openstack-operator-controller-operator-599bffcb5d-5sws5\" (UID: \"7fd96c0c-f327-4080-9429-28e5a10b932a\") " pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.472684 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whvrv\" (UniqueName: \"kubernetes.io/projected/7fd96c0c-f327-4080-9429-28e5a10b932a-kube-api-access-whvrv\") pod \"openstack-operator-controller-operator-599bffcb5d-5sws5\" (UID: \"7fd96c0c-f327-4080-9429-28e5a10b932a\") " pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.494839 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whvrv\" (UniqueName: \"kubernetes.io/projected/7fd96c0c-f327-4080-9429-28e5a10b932a-kube-api-access-whvrv\") pod 
\"openstack-operator-controller-operator-599bffcb5d-5sws5\" (UID: \"7fd96c0c-f327-4080-9429-28e5a10b932a\") " pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.583029 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" Oct 10 07:06:40 crc kubenswrapper[4732]: I1010 07:06:40.797616 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5"] Oct 10 07:06:40 crc kubenswrapper[4732]: W1010 07:06:40.807671 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd96c0c_f327_4080_9429_28e5a10b932a.slice/crio-c8a111982f24d625748c69614d73ad1458852af946149bd79cf8b72d4c1670d8 WatchSource:0}: Error finding container c8a111982f24d625748c69614d73ad1458852af946149bd79cf8b72d4c1670d8: Status 404 returned error can't find the container with id c8a111982f24d625748c69614d73ad1458852af946149bd79cf8b72d4c1670d8 Oct 10 07:06:41 crc kubenswrapper[4732]: I1010 07:06:41.552421 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" event={"ID":"7fd96c0c-f327-4080-9429-28e5a10b932a","Type":"ContainerStarted","Data":"c8a111982f24d625748c69614d73ad1458852af946149bd79cf8b72d4c1670d8"} Oct 10 07:06:45 crc kubenswrapper[4732]: I1010 07:06:45.595000 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" event={"ID":"7fd96c0c-f327-4080-9429-28e5a10b932a","Type":"ContainerStarted","Data":"85a0724565e3e80fec0a83fabdf917bc22b05a7be1a4cfaa2f92be88f064c5c5"} Oct 10 07:06:48 crc kubenswrapper[4732]: I1010 07:06:48.615877 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" event={"ID":"7fd96c0c-f327-4080-9429-28e5a10b932a","Type":"ContainerStarted","Data":"cac617b4023a41abd95032884fb5f974b1d4bdb80eddbaa48c03c7c54181b6df"} Oct 10 07:06:48 crc kubenswrapper[4732]: I1010 07:06:48.616661 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" Oct 10 07:06:48 crc kubenswrapper[4732]: I1010 07:06:48.655676 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" podStartSLOduration=1.742947325 podStartE2EDuration="8.655646603s" podCreationTimestamp="2025-10-10 07:06:40 +0000 UTC" firstStartedPulling="2025-10-10 07:06:40.813414689 +0000 UTC m=+927.883005930" lastFinishedPulling="2025-10-10 07:06:47.726113967 +0000 UTC m=+934.795705208" observedRunningTime="2025-10-10 07:06:48.6496482 +0000 UTC m=+935.719239531" watchObservedRunningTime="2025-10-10 07:06:48.655646603 +0000 UTC m=+935.725237884" Oct 10 07:06:50 crc kubenswrapper[4732]: I1010 07:06:50.586447 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-599bffcb5d-5sws5" Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.356516 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.357282 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.357370 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.358529 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91a09d30e877ca33916c77e5509ae9d7f46220996d3447c91784df288ae8e1b0"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.358654 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://91a09d30e877ca33916c77e5509ae9d7f46220996d3447c91784df288ae8e1b0" gracePeriod=600 Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.673399 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="91a09d30e877ca33916c77e5509ae9d7f46220996d3447c91784df288ae8e1b0" exitCode=0 Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.673462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"91a09d30e877ca33916c77e5509ae9d7f46220996d3447c91784df288ae8e1b0"} Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.673786 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"239755fce5a5e3e2f7099222145032c91288a72b835ff1a7f25dd0d9f8c8d6b0"} Oct 10 07:06:55 crc kubenswrapper[4732]: I1010 07:06:55.673814 4732 scope.go:117] "RemoveContainer" containerID="53a6572dbd94b6a842e7d21f0e7b30a3d933ba27c7ff2433ef57f5a4d47b6c8d" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.895837 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.898630 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.901979 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-l6649" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.912404 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.925139 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.926234 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.935296 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xc88h" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.950025 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.950298 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47z58\" (UniqueName: \"kubernetes.io/projected/eb7804d7-814a-4aeb-b9d5-b359cada4441-kube-api-access-47z58\") pod \"cinder-operator-controller-manager-7b7fb68549-xt4c5\" (UID: \"eb7804d7-814a-4aeb-b9d5-b359cada4441\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.950365 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgxj\" (UniqueName: \"kubernetes.io/projected/ebf07c0d-be2b-41ec-8363-bdfcc2d3802a-kube-api-access-zhgxj\") pod \"barbican-operator-controller-manager-658bdf4b74-b54wh\" (UID: \"ebf07c0d-be2b-41ec-8363-bdfcc2d3802a\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.962057 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.963073 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.966654 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.967655 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.970108 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vvcmd" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.970195 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2xz9l" Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.981241 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.990325 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2"] Oct 10 07:07:06 crc kubenswrapper[4732]: I1010 07:07:06.999900 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.001034 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.003536 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vrksx" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.007779 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.008782 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.010964 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-8cqs8" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.011773 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.017392 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.053454 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47z58\" (UniqueName: \"kubernetes.io/projected/eb7804d7-814a-4aeb-b9d5-b359cada4441-kube-api-access-47z58\") pod \"cinder-operator-controller-manager-7b7fb68549-xt4c5\" (UID: \"eb7804d7-814a-4aeb-b9d5-b359cada4441\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.053546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgxj\" (UniqueName: 
\"kubernetes.io/projected/ebf07c0d-be2b-41ec-8363-bdfcc2d3802a-kube-api-access-zhgxj\") pod \"barbican-operator-controller-manager-658bdf4b74-b54wh\" (UID: \"ebf07c0d-be2b-41ec-8363-bdfcc2d3802a\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.054648 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.055569 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.057578 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-xgl5j" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.057770 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.062370 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.063493 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.065005 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-x9gt9" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.080522 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.092217 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.093109 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.093221 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.094825 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ghst5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.097958 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.108417 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgxj\" (UniqueName: \"kubernetes.io/projected/ebf07c0d-be2b-41ec-8363-bdfcc2d3802a-kube-api-access-zhgxj\") pod \"barbican-operator-controller-manager-658bdf4b74-b54wh\" (UID: \"ebf07c0d-be2b-41ec-8363-bdfcc2d3802a\") " pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.112113 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47z58\" (UniqueName: \"kubernetes.io/projected/eb7804d7-814a-4aeb-b9d5-b359cada4441-kube-api-access-47z58\") pod \"cinder-operator-controller-manager-7b7fb68549-xt4c5\" (UID: \"eb7804d7-814a-4aeb-b9d5-b359cada4441\") " pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.154491 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155711 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldkhh\" (UniqueName: \"kubernetes.io/projected/edbb64bd-a7b0-40ea-90e2-7cc1fee46f76-kube-api-access-ldkhh\") pod \"horizon-operator-controller-manager-7ffbcb7588-s8qhg\" (UID: 
\"edbb64bd-a7b0-40ea-90e2-7cc1fee46f76\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155754 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6gqf\" (UniqueName: \"kubernetes.io/projected/ac7e4be6-eb30-4fec-bb28-8f7181d7d337-kube-api-access-p6gqf\") pod \"heat-operator-controller-manager-858f76bbdd-gvtth\" (UID: \"ac7e4be6-eb30-4fec-bb28-8f7181d7d337\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155809 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fvm\" (UniqueName: \"kubernetes.io/projected/6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b-kube-api-access-n9fvm\") pod \"keystone-operator-controller-manager-55b6b7c7b8-zz2vz\" (UID: \"6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155859 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqc69\" (UniqueName: \"kubernetes.io/projected/579b1489-9552-485c-92da-5386e7b2afeb-kube-api-access-vqc69\") pod \"designate-operator-controller-manager-85d5d9dd78-pbxl8\" (UID: \"579b1489-9552-485c-92da-5386e7b2afeb\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155899 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdm6w\" (UniqueName: \"kubernetes.io/projected/455baf3e-9434-4962-93bb-cd6497747fa5-kube-api-access-xdm6w\") pod \"glance-operator-controller-manager-84b9b84486-9zhd2\" (UID: \"455baf3e-9434-4962-93bb-cd6497747fa5\") " 
pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvpnn\" (UniqueName: \"kubernetes.io/projected/33631d8c-63c6-4912-be80-748b6c997cae-kube-api-access-zvpnn\") pod \"ironic-operator-controller-manager-9c5c78d49-dpw4c\" (UID: \"33631d8c-63c6-4912-be80-748b6c997cae\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155941 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da772cc-f90b-4ee7-8793-2fd804249c91-cert\") pod \"infra-operator-controller-manager-656bcbd775-nztbg\" (UID: \"0da772cc-f90b-4ee7-8793-2fd804249c91\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.155992 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq7zs\" (UniqueName: \"kubernetes.io/projected/0da772cc-f90b-4ee7-8793-2fd804249c91-kube-api-access-kq7zs\") pod \"infra-operator-controller-manager-656bcbd775-nztbg\" (UID: \"0da772cc-f90b-4ee7-8793-2fd804249c91\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.157324 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.166158 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-925h5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.176024 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.199872 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.201156 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.206847 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qb5wv" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.211390 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.212601 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.219029 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2tt45" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.222948 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.235809 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.236445 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.242565 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.243844 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.247879 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-jqv9k" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.250983 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.258353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq7zs\" (UniqueName: \"kubernetes.io/projected/0da772cc-f90b-4ee7-8793-2fd804249c91-kube-api-access-kq7zs\") pod \"infra-operator-controller-manager-656bcbd775-nztbg\" (UID: \"0da772cc-f90b-4ee7-8793-2fd804249c91\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.258429 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldkhh\" (UniqueName: \"kubernetes.io/projected/edbb64bd-a7b0-40ea-90e2-7cc1fee46f76-kube-api-access-ldkhh\") pod \"horizon-operator-controller-manager-7ffbcb7588-s8qhg\" (UID: \"edbb64bd-a7b0-40ea-90e2-7cc1fee46f76\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.258447 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6gqf\" (UniqueName: \"kubernetes.io/projected/ac7e4be6-eb30-4fec-bb28-8f7181d7d337-kube-api-access-p6gqf\") pod \"heat-operator-controller-manager-858f76bbdd-gvtth\" (UID: \"ac7e4be6-eb30-4fec-bb28-8f7181d7d337\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.258468 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fvm\" (UniqueName: \"kubernetes.io/projected/6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b-kube-api-access-n9fvm\") pod \"keystone-operator-controller-manager-55b6b7c7b8-zz2vz\" (UID: \"6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 
07:07:07.258487 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqc69\" (UniqueName: \"kubernetes.io/projected/579b1489-9552-485c-92da-5386e7b2afeb-kube-api-access-vqc69\") pod \"designate-operator-controller-manager-85d5d9dd78-pbxl8\" (UID: \"579b1489-9552-485c-92da-5386e7b2afeb\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.258517 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdm6w\" (UniqueName: \"kubernetes.io/projected/455baf3e-9434-4962-93bb-cd6497747fa5-kube-api-access-xdm6w\") pod \"glance-operator-controller-manager-84b9b84486-9zhd2\" (UID: \"455baf3e-9434-4962-93bb-cd6497747fa5\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.258537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvpnn\" (UniqueName: \"kubernetes.io/projected/33631d8c-63c6-4912-be80-748b6c997cae-kube-api-access-zvpnn\") pod \"ironic-operator-controller-manager-9c5c78d49-dpw4c\" (UID: \"33631d8c-63c6-4912-be80-748b6c997cae\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.258556 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da772cc-f90b-4ee7-8793-2fd804249c91-cert\") pod \"infra-operator-controller-manager-656bcbd775-nztbg\" (UID: \"0da772cc-f90b-4ee7-8793-2fd804249c91\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: E1010 07:07:07.258654 4732 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 10 07:07:07 crc 
kubenswrapper[4732]: E1010 07:07:07.258723 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da772cc-f90b-4ee7-8793-2fd804249c91-cert podName:0da772cc-f90b-4ee7-8793-2fd804249c91 nodeName:}" failed. No retries permitted until 2025-10-10 07:07:07.758680479 +0000 UTC m=+954.828271720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da772cc-f90b-4ee7-8793-2fd804249c91-cert") pod "infra-operator-controller-manager-656bcbd775-nztbg" (UID: "0da772cc-f90b-4ee7-8793-2fd804249c91") : secret "infra-operator-webhook-server-cert" not found Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.264781 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.267041 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.268169 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.271131 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zxgp8" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.326758 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.327788 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldkhh\" (UniqueName: \"kubernetes.io/projected/edbb64bd-a7b0-40ea-90e2-7cc1fee46f76-kube-api-access-ldkhh\") pod \"horizon-operator-controller-manager-7ffbcb7588-s8qhg\" (UID: \"edbb64bd-a7b0-40ea-90e2-7cc1fee46f76\") " pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.336399 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.342505 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqc69\" (UniqueName: \"kubernetes.io/projected/579b1489-9552-485c-92da-5386e7b2afeb-kube-api-access-vqc69\") pod \"designate-operator-controller-manager-85d5d9dd78-pbxl8\" (UID: \"579b1489-9552-485c-92da-5386e7b2afeb\") " pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.342676 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdm6w\" (UniqueName: \"kubernetes.io/projected/455baf3e-9434-4962-93bb-cd6497747fa5-kube-api-access-xdm6w\") pod \"glance-operator-controller-manager-84b9b84486-9zhd2\" (UID: \"455baf3e-9434-4962-93bb-cd6497747fa5\") " pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.347109 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvpnn\" (UniqueName: \"kubernetes.io/projected/33631d8c-63c6-4912-be80-748b6c997cae-kube-api-access-zvpnn\") pod \"ironic-operator-controller-manager-9c5c78d49-dpw4c\" (UID: \"33631d8c-63c6-4912-be80-748b6c997cae\") " pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.349125 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6gqf\" (UniqueName: \"kubernetes.io/projected/ac7e4be6-eb30-4fec-bb28-8f7181d7d337-kube-api-access-p6gqf\") pod \"heat-operator-controller-manager-858f76bbdd-gvtth\" (UID: \"ac7e4be6-eb30-4fec-bb28-8f7181d7d337\") " pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.354144 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq7zs\" (UniqueName: \"kubernetes.io/projected/0da772cc-f90b-4ee7-8793-2fd804249c91-kube-api-access-kq7zs\") pod \"infra-operator-controller-manager-656bcbd775-nztbg\" (UID: \"0da772cc-f90b-4ee7-8793-2fd804249c91\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.360159 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtq2x\" (UniqueName: \"kubernetes.io/projected/7ed8853a-5e8a-4dce-abb2-73bc7375a2bb-kube-api-access-mtq2x\") pod \"neutron-operator-controller-manager-79d585cb66-4hzzt\" (UID: \"7ed8853a-5e8a-4dce-abb2-73bc7375a2bb\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.360260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdl66\" (UniqueName: \"kubernetes.io/projected/0f893cf0-2c81-455d-a447-b0745e767b18-kube-api-access-kdl66\") pod \"mariadb-operator-controller-manager-f9fb45f8f-42jm6\" (UID: \"0f893cf0-2c81-455d-a447-b0745e767b18\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.360290 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhkjh\" (UniqueName: \"kubernetes.io/projected/de2d002c-ff31-4c5b-aaa6-9e19c00caf6c-kube-api-access-fhkjh\") pod \"manila-operator-controller-manager-5f67fbc655-2knl5\" (UID: \"de2d002c-ff31-4c5b-aaa6-9e19c00caf6c\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.360346 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ld4k6\" (UniqueName: \"kubernetes.io/projected/48a56683-0762-4720-9640-c2b4e9ffb277-kube-api-access-ld4k6\") pod \"nova-operator-controller-manager-5df598886f-s5qjg\" (UID: \"48a56683-0762-4720-9640-c2b4e9ffb277\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.362635 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.365458 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.381515 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fvm\" (UniqueName: \"kubernetes.io/projected/6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b-kube-api-access-n9fvm\") pod \"keystone-operator-controller-manager-55b6b7c7b8-zz2vz\" (UID: \"6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b\") " pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.388175 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.388389 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-4k6gn" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.409765 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.411928 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.415175 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-p4dct" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.435039 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.451357 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.452805 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.456417 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9qfz5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.458766 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463432 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463639 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtq2x\" (UniqueName: \"kubernetes.io/projected/7ed8853a-5e8a-4dce-abb2-73bc7375a2bb-kube-api-access-mtq2x\") pod \"neutron-operator-controller-manager-79d585cb66-4hzzt\" (UID: \"7ed8853a-5e8a-4dce-abb2-73bc7375a2bb\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463747 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5br8\" (UniqueName: \"kubernetes.io/projected/38e49d40-f9b0-476b-a875-891fdb26d8fc-kube-api-access-l5br8\") pod \"ovn-operator-controller-manager-79df5fb58c-sl8ht\" (UID: \"38e49d40-f9b0-476b-a875-891fdb26d8fc\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463781 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxcj\" (UniqueName: \"kubernetes.io/projected/9c145bb1-292a-4675-be8c-9bd49d4034f2-kube-api-access-ktxcj\") pod \"octavia-operator-controller-manager-69fdcfc5f5-mx78z\" (UID: \"9c145bb1-292a-4675-be8c-9bd49d4034f2\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463826 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdl66\" (UniqueName: \"kubernetes.io/projected/0f893cf0-2c81-455d-a447-b0745e767b18-kube-api-access-kdl66\") pod \"mariadb-operator-controller-manager-f9fb45f8f-42jm6\" (UID: \"0f893cf0-2c81-455d-a447-b0745e767b18\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" Oct 10 07:07:07 crc 
kubenswrapper[4732]: I1010 07:07:07.463849 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d99c70c9-9474-4418-8030-df6d871283e7-cert\") pod \"openstack-baremetal-operator-controller-manager-84c868ff4cfg79v\" (UID: \"d99c70c9-9474-4418-8030-df6d871283e7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463882 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhkjh\" (UniqueName: \"kubernetes.io/projected/de2d002c-ff31-4c5b-aaa6-9e19c00caf6c-kube-api-access-fhkjh\") pod \"manila-operator-controller-manager-5f67fbc655-2knl5\" (UID: \"de2d002c-ff31-4c5b-aaa6-9e19c00caf6c\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463914 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgzx\" (UniqueName: \"kubernetes.io/projected/d99c70c9-9474-4418-8030-df6d871283e7-kube-api-access-vjgzx\") pod \"openstack-baremetal-operator-controller-manager-84c868ff4cfg79v\" (UID: \"d99c70c9-9474-4418-8030-df6d871283e7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463968 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g79hp\" (UniqueName: \"kubernetes.io/projected/20e2b6da-45d9-40d7-8a93-05cc865543c6-kube-api-access-g79hp\") pod \"placement-operator-controller-manager-68b6c87b68-cbf4m\" (UID: \"20e2b6da-45d9-40d7-8a93-05cc865543c6\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.463999 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ld4k6\" (UniqueName: \"kubernetes.io/projected/48a56683-0762-4720-9640-c2b4e9ffb277-kube-api-access-ld4k6\") pod \"nova-operator-controller-manager-5df598886f-s5qjg\" (UID: \"48a56683-0762-4720-9640-c2b4e9ffb277\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.465113 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.465528 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rv97c" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.466574 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.476673 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5f5rt" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.476888 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.495342 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.497612 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdl66\" (UniqueName: \"kubernetes.io/projected/0f893cf0-2c81-455d-a447-b0745e767b18-kube-api-access-kdl66\") pod \"mariadb-operator-controller-manager-f9fb45f8f-42jm6\" (UID: \"0f893cf0-2c81-455d-a447-b0745e767b18\") " pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.501956 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.502153 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld4k6\" (UniqueName: \"kubernetes.io/projected/48a56683-0762-4720-9640-c2b4e9ffb277-kube-api-access-ld4k6\") pod \"nova-operator-controller-manager-5df598886f-s5qjg\" (UID: \"48a56683-0762-4720-9640-c2b4e9ffb277\") " pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.509527 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtq2x\" (UniqueName: \"kubernetes.io/projected/7ed8853a-5e8a-4dce-abb2-73bc7375a2bb-kube-api-access-mtq2x\") pod \"neutron-operator-controller-manager-79d585cb66-4hzzt\" (UID: \"7ed8853a-5e8a-4dce-abb2-73bc7375a2bb\") " pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.513312 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.536957 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.541043 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhkjh\" (UniqueName: \"kubernetes.io/projected/de2d002c-ff31-4c5b-aaa6-9e19c00caf6c-kube-api-access-fhkjh\") pod \"manila-operator-controller-manager-5f67fbc655-2knl5\" (UID: \"de2d002c-ff31-4c5b-aaa6-9e19c00caf6c\") " pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.544830 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.548277 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.565427 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.565679 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.566473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5br8\" (UniqueName: \"kubernetes.io/projected/38e49d40-f9b0-476b-a875-891fdb26d8fc-kube-api-access-l5br8\") pod \"ovn-operator-controller-manager-79df5fb58c-sl8ht\" (UID: \"38e49d40-f9b0-476b-a875-891fdb26d8fc\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.566495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxcj\" (UniqueName: \"kubernetes.io/projected/9c145bb1-292a-4675-be8c-9bd49d4034f2-kube-api-access-ktxcj\") pod \"octavia-operator-controller-manager-69fdcfc5f5-mx78z\" (UID: \"9c145bb1-292a-4675-be8c-9bd49d4034f2\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.566525 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d99c70c9-9474-4418-8030-df6d871283e7-cert\") pod \"openstack-baremetal-operator-controller-manager-84c868ff4cfg79v\" (UID: \"d99c70c9-9474-4418-8030-df6d871283e7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.566552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgzx\" (UniqueName: \"kubernetes.io/projected/d99c70c9-9474-4418-8030-df6d871283e7-kube-api-access-vjgzx\") pod \"openstack-baremetal-operator-controller-manager-84c868ff4cfg79v\" (UID: \"d99c70c9-9474-4418-8030-df6d871283e7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 
07:07:07.566590 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g79hp\" (UniqueName: \"kubernetes.io/projected/20e2b6da-45d9-40d7-8a93-05cc865543c6-kube-api-access-g79hp\") pod \"placement-operator-controller-manager-68b6c87b68-cbf4m\" (UID: \"20e2b6da-45d9-40d7-8a93-05cc865543c6\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" Oct 10 07:07:07 crc kubenswrapper[4732]: E1010 07:07:07.567807 4732 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 10 07:07:07 crc kubenswrapper[4732]: E1010 07:07:07.567874 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d99c70c9-9474-4418-8030-df6d871283e7-cert podName:d99c70c9-9474-4418-8030-df6d871283e7 nodeName:}" failed. No retries permitted until 2025-10-10 07:07:08.067855219 +0000 UTC m=+955.137446460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d99c70c9-9474-4418-8030-df6d871283e7-cert") pod "openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" (UID: "d99c70c9-9474-4418-8030-df6d871283e7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.580488 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.589155 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.597770 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.598898 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.606314 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgzx\" (UniqueName: \"kubernetes.io/projected/d99c70c9-9474-4418-8030-df6d871283e7-kube-api-access-vjgzx\") pod \"openstack-baremetal-operator-controller-manager-84c868ff4cfg79v\" (UID: \"d99c70c9-9474-4418-8030-df6d871283e7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.614025 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-f8jdj" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.620142 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.629492 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g79hp\" (UniqueName: \"kubernetes.io/projected/20e2b6da-45d9-40d7-8a93-05cc865543c6-kube-api-access-g79hp\") pod \"placement-operator-controller-manager-68b6c87b68-cbf4m\" (UID: \"20e2b6da-45d9-40d7-8a93-05cc865543c6\") " pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.629551 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxcj\" (UniqueName: \"kubernetes.io/projected/9c145bb1-292a-4675-be8c-9bd49d4034f2-kube-api-access-ktxcj\") pod \"octavia-operator-controller-manager-69fdcfc5f5-mx78z\" (UID: \"9c145bb1-292a-4675-be8c-9bd49d4034f2\") " pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.630382 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5br8\" (UniqueName: \"kubernetes.io/projected/38e49d40-f9b0-476b-a875-891fdb26d8fc-kube-api-access-l5br8\") pod \"ovn-operator-controller-manager-79df5fb58c-sl8ht\" (UID: \"38e49d40-f9b0-476b-a875-891fdb26d8fc\") " pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.661983 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.668250 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58sn7\" (UniqueName: \"kubernetes.io/projected/9744ec37-6d1a-4b31-b443-26ef804824f3-kube-api-access-58sn7\") pod \"test-operator-controller-manager-5458f77c4-j8cfc\" (UID: \"9744ec37-6d1a-4b31-b443-26ef804824f3\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.668296 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8z4m\" (UniqueName: \"kubernetes.io/projected/af04a067-8a30-4d2d-a0ff-b3206375d952-kube-api-access-q8z4m\") pod \"swift-operator-controller-manager-db6d7f97b-8zx8p\" (UID: \"af04a067-8a30-4d2d-a0ff-b3206375d952\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.668328 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbl4\" (UniqueName: \"kubernetes.io/projected/a0ce9c8b-219c-40da-bc5b-b171446c36ba-kube-api-access-llbl4\") pod \"telemetry-operator-controller-manager-67cfc6749b-mhkk2\" (UID: \"a0ce9c8b-219c-40da-bc5b-b171446c36ba\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.679339 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.718232 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.719391 4732 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.733570 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vw2wc" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.745166 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.769514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppw7t\" (UniqueName: \"kubernetes.io/projected/6913b400-809d-4aa7-b478-999c34cdf0da-kube-api-access-ppw7t\") pod \"watcher-operator-controller-manager-7f554bff7b-26jhx\" (UID: \"6913b400-809d-4aa7-b478-999c34cdf0da\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.769599 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da772cc-f90b-4ee7-8793-2fd804249c91-cert\") pod \"infra-operator-controller-manager-656bcbd775-nztbg\" (UID: \"0da772cc-f90b-4ee7-8793-2fd804249c91\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.769648 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58sn7\" (UniqueName: \"kubernetes.io/projected/9744ec37-6d1a-4b31-b443-26ef804824f3-kube-api-access-58sn7\") pod \"test-operator-controller-manager-5458f77c4-j8cfc\" (UID: \"9744ec37-6d1a-4b31-b443-26ef804824f3\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.769706 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8z4m\" (UniqueName: \"kubernetes.io/projected/af04a067-8a30-4d2d-a0ff-b3206375d952-kube-api-access-q8z4m\") pod \"swift-operator-controller-manager-db6d7f97b-8zx8p\" (UID: \"af04a067-8a30-4d2d-a0ff-b3206375d952\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.769741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llbl4\" (UniqueName: \"kubernetes.io/projected/a0ce9c8b-219c-40da-bc5b-b171446c36ba-kube-api-access-llbl4\") pod \"telemetry-operator-controller-manager-67cfc6749b-mhkk2\" (UID: \"a0ce9c8b-219c-40da-bc5b-b171446c36ba\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.776129 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.791470 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da772cc-f90b-4ee7-8793-2fd804249c91-cert\") pod \"infra-operator-controller-manager-656bcbd775-nztbg\" (UID: \"0da772cc-f90b-4ee7-8793-2fd804249c91\") " pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.792220 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.809570 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbl4\" (UniqueName: \"kubernetes.io/projected/a0ce9c8b-219c-40da-bc5b-b171446c36ba-kube-api-access-llbl4\") pod \"telemetry-operator-controller-manager-67cfc6749b-mhkk2\" (UID: \"a0ce9c8b-219c-40da-bc5b-b171446c36ba\") " pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.814414 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8z4m\" (UniqueName: \"kubernetes.io/projected/af04a067-8a30-4d2d-a0ff-b3206375d952-kube-api-access-q8z4m\") pod \"swift-operator-controller-manager-db6d7f97b-8zx8p\" (UID: \"af04a067-8a30-4d2d-a0ff-b3206375d952\") " pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.815942 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.821846 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.824290 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.827892 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-m7l6k" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.828038 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58sn7\" (UniqueName: \"kubernetes.io/projected/9744ec37-6d1a-4b31-b443-26ef804824f3-kube-api-access-58sn7\") pod \"test-operator-controller-manager-5458f77c4-j8cfc\" (UID: \"9744ec37-6d1a-4b31-b443-26ef804824f3\") " pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.828087 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.834761 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.838045 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.838947 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.842808 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf"] Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.846394 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ztxmm" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.848547 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.872547 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hmsx\" (UniqueName: \"kubernetes.io/projected/6f554597-7d00-422a-b570-834795047cf9-kube-api-access-9hmsx\") pod \"openstack-operator-controller-manager-5698bb9464-8qpcv\" (UID: \"6f554597-7d00-422a-b570-834795047cf9\") " pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.872675 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppw7t\" (UniqueName: \"kubernetes.io/projected/6913b400-809d-4aa7-b478-999c34cdf0da-kube-api-access-ppw7t\") pod \"watcher-operator-controller-manager-7f554bff7b-26jhx\" (UID: \"6913b400-809d-4aa7-b478-999c34cdf0da\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.872810 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f554597-7d00-422a-b570-834795047cf9-cert\") pod \"openstack-operator-controller-manager-5698bb9464-8qpcv\" (UID: 
\"6f554597-7d00-422a-b570-834795047cf9\") " pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.872875 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzgcv\" (UniqueName: \"kubernetes.io/projected/26800fd4-b33e-4bb8-815b-5ec03fc9b22b-kube-api-access-wzgcv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf\" (UID: \"26800fd4-b33e-4bb8-815b-5ec03fc9b22b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.881246 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.887629 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppw7t\" (UniqueName: \"kubernetes.io/projected/6913b400-809d-4aa7-b478-999c34cdf0da-kube-api-access-ppw7t\") pod \"watcher-operator-controller-manager-7f554bff7b-26jhx\" (UID: \"6913b400-809d-4aa7-b478-999c34cdf0da\") " pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.968145 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.981939 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f554597-7d00-422a-b570-834795047cf9-cert\") pod \"openstack-operator-controller-manager-5698bb9464-8qpcv\" (UID: \"6f554597-7d00-422a-b570-834795047cf9\") " pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.982023 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzgcv\" (UniqueName: \"kubernetes.io/projected/26800fd4-b33e-4bb8-815b-5ec03fc9b22b-kube-api-access-wzgcv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf\" (UID: \"26800fd4-b33e-4bb8-815b-5ec03fc9b22b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" Oct 10 07:07:07 crc kubenswrapper[4732]: I1010 07:07:07.982093 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hmsx\" (UniqueName: \"kubernetes.io/projected/6f554597-7d00-422a-b570-834795047cf9-kube-api-access-9hmsx\") pod \"openstack-operator-controller-manager-5698bb9464-8qpcv\" (UID: \"6f554597-7d00-422a-b570-834795047cf9\") " pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:07 crc kubenswrapper[4732]: E1010 07:07:07.982492 4732 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 10 07:07:07 crc kubenswrapper[4732]: E1010 07:07:07.982540 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f554597-7d00-422a-b570-834795047cf9-cert podName:6f554597-7d00-422a-b570-834795047cf9 nodeName:}" failed. No retries permitted until 2025-10-10 07:07:08.48252402 +0000 UTC m=+955.552115261 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f554597-7d00-422a-b570-834795047cf9-cert") pod "openstack-operator-controller-manager-5698bb9464-8qpcv" (UID: "6f554597-7d00-422a-b570-834795047cf9") : secret "webhook-server-cert" not found Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.013105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hmsx\" (UniqueName: \"kubernetes.io/projected/6f554597-7d00-422a-b570-834795047cf9-kube-api-access-9hmsx\") pod \"openstack-operator-controller-manager-5698bb9464-8qpcv\" (UID: \"6f554597-7d00-422a-b570-834795047cf9\") " pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.020161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzgcv\" (UniqueName: \"kubernetes.io/projected/26800fd4-b33e-4bb8-815b-5ec03fc9b22b-kube-api-access-wzgcv\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf\" (UID: \"26800fd4-b33e-4bb8-815b-5ec03fc9b22b\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.064116 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.064920 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.082488 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d99c70c9-9474-4418-8030-df6d871283e7-cert\") pod \"openstack-baremetal-operator-controller-manager-84c868ff4cfg79v\" (UID: \"d99c70c9-9474-4418-8030-df6d871283e7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.085354 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d99c70c9-9474-4418-8030-df6d871283e7-cert\") pod \"openstack-baremetal-operator-controller-manager-84c868ff4cfg79v\" (UID: \"d99c70c9-9474-4418-8030-df6d871283e7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.137814 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg"] Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.165382 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5"] Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.234438 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.323003 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.338894 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh"] Oct 10 07:07:08 crc kubenswrapper[4732]: W1010 07:07:08.386670 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf07c0d_be2b_41ec_8363_bdfcc2d3802a.slice/crio-03b29f333b492b987a1d6b9dabfda1e8b8b52d41aba7179c25942d7a9bda26e6 WatchSource:0}: Error finding container 03b29f333b492b987a1d6b9dabfda1e8b8b52d41aba7179c25942d7a9bda26e6: Status 404 returned error can't find the container with id 03b29f333b492b987a1d6b9dabfda1e8b8b52d41aba7179c25942d7a9bda26e6 Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.506899 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f554597-7d00-422a-b570-834795047cf9-cert\") pod \"openstack-operator-controller-manager-5698bb9464-8qpcv\" (UID: \"6f554597-7d00-422a-b570-834795047cf9\") " pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.521915 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f554597-7d00-422a-b570-834795047cf9-cert\") pod \"openstack-operator-controller-manager-5698bb9464-8qpcv\" (UID: \"6f554597-7d00-422a-b570-834795047cf9\") " pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.801137 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" 
event={"ID":"edbb64bd-a7b0-40ea-90e2-7cc1fee46f76","Type":"ContainerStarted","Data":"6b399124086d967fd0d95f67d76edb59cd193ed7ada091226ac0cc5ad2a5dc20"} Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.802234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" event={"ID":"eb7804d7-814a-4aeb-b9d5-b359cada4441","Type":"ContainerStarted","Data":"ec801912ab6afad1f023283dcc910a299995f8eb2e031b401deeb71df94f00c2"} Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.804822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" event={"ID":"ebf07c0d-be2b-41ec-8363-bdfcc2d3802a","Type":"ContainerStarted","Data":"03b29f333b492b987a1d6b9dabfda1e8b8b52d41aba7179c25942d7a9bda26e6"} Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.812869 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.949115 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt"] Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.954163 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6"] Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.960180 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg"] Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.976527 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8"] Oct 10 07:07:08 crc kubenswrapper[4732]: W1010 07:07:08.981254 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33631d8c_63c6_4912_be80_748b6c997cae.slice/crio-8b0e561029b33745fdae456fddaabbd433050e11cba984d2297206b800d0ebeb WatchSource:0}: Error finding container 8b0e561029b33745fdae456fddaabbd433050e11cba984d2297206b800d0ebeb: Status 404 returned error can't find the container with id 8b0e561029b33745fdae456fddaabbd433050e11cba984d2297206b800d0ebeb Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.983074 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c"] Oct 10 07:07:08 crc kubenswrapper[4732]: W1010 07:07:08.983445 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod579b1489_9552_485c_92da_5386e7b2afeb.slice/crio-1fe4b3d8d8fdccceb29cfd2a1bea741d2dae27f44f42f3fc6d071d6c8d3e613d WatchSource:0}: Error finding container 1fe4b3d8d8fdccceb29cfd2a1bea741d2dae27f44f42f3fc6d071d6c8d3e613d: Status 404 returned error can't find the container with id 1fe4b3d8d8fdccceb29cfd2a1bea741d2dae27f44f42f3fc6d071d6c8d3e613d Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.991790 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz"] Oct 10 07:07:08 crc kubenswrapper[4732]: I1010 07:07:08.996626 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z"] Oct 10 07:07:09 crc kubenswrapper[4732]: W1010 07:07:09.008009 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8b2f0b_d16d_4a9a_a3a1_dfc2547a061b.slice/crio-29b9800f2dd75a9326ee9069e0a064a3e1f86bcec5dd04e50a6fa082e960169f WatchSource:0}: Error finding container 29b9800f2dd75a9326ee9069e0a064a3e1f86bcec5dd04e50a6fa082e960169f: Status 404 returned error can't find the 
container with id 29b9800f2dd75a9326ee9069e0a064a3e1f86bcec5dd04e50a6fa082e960169f Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.033122 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.056412 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.060975 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.444474 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.477787 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.483315 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.502893 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.526366 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.533447 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.541458 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.542845 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc"] Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.548112 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:03b4f3db4b373515f7e4095984b97197c05a14f87b2a0a525eb5d7be1d7bda66,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:6722a752fb7cbffbae811f6ad6567120fbd4ebbe8c38a83ec2df02850a3276bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:2115452234aedb505ed4efc6cd9b9a4ce3b9809aa7d0128d8fbeeee84dad1a69,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:50597a8eaa6c4383f357574dcab8358b698729797b4156d932985a08ab86b7cd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:cb4997d62c7b2534233a676cb92e19cf85dda07e2fb9fa642c28aab30489f69a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAG
E_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:1ccbf3f6cf24c9ee91bed71467491e22b8cb4b95bce90250f4174fae936b0fa1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:e91d58021b54c46883595ff66be65882de54abdb3be2ca53c4162b20d18b5f48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:cbe345acb37e57986ecf6685d28c72d0e639bdb493a18e9d3ba947d6c3a16384,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:e7dcc3bf23d5e0393ac173e3c43d4ae85f4613a4fd16b3c147dc32ae491d49bf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:2a1a8b582c6e4cc31081bd8b0887acf45e31c1d14596c4e361d27d08fef0debf,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:86daeb9c834bfcedb533086dff59a6b5b6e832b94ce2a9116337f8736bb80032,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:6d28de018f6e1672e775a75735e3bc16b63da41acd8fb5196ee0b06856c07133,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilom
eter-notification@sha256:c5fc9b72fc593bcf3b569c7ed24a256448eb1afab1504e668a3822e978be1306,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:88b99249f15470f359fb554f7f3a56974b743f4655e3f0c982c0260f75a67697,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:e861d66785047d39eb68d9bac23e3f57ac84d9bd95593502d9b3b913b99fd1a4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:b95f09bf3d259f9eacf3b63931977483f5c3c332f49b95ee8a69d8e3fb71d082,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:6fc7801c0d18d41b9f11484b1cdb342de9cebd93072ec2205dbe40945715184f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:d4d824b80cbed683543d9e8c7045ac97e080774f45a5067ccbca26404e067821,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:182ec75938d8d3fb7d8f916373368add24062fec90489aa57776a81d0b36ea20,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:9507ba5ab74cbae902e2dc07f89c7b3b5b76d8079e444365fe0eee6000fd7aaa,Value
From:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:17db080dcc4099f8a20aa0f238b6bca5c104672ae46743adeab9d1637725ecaa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:fd55cf3d73bfdc518419c9ba0b0cbef275140ae2d3bd0342a7310f81d57c2d78,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:d164a9bd383f50df69fc22e7422f4650cd5076c90ed19278fc0f04e54345a63d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:6beffe7d0bd75f9d1f495aeb7ab2334a2414af2c581d4833363df8441ed01018,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:2308c7b6c3d0aabbadfc9a06d84d67d2243f27fe8eed740ee96b1ce910203f62,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:9cf0ca292340f1f978603955ef682effbf24316d6e2376b1c89906d84c3f06d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opens
tack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:58f678016d7f6c8fe579abe886fd138ef853642faa6766ca60639feac12d82ac,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:46f92909153aaf03a585374b77d103c536509747e3270558d9a533295c46a7c5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:7fe367f51638c5c302fd3f8e66a31b09cb3b11519a7f72ef142b6c6fe8b91694,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:9ebf424d4107275a2e3f21f7a18ef257ff2f97c1298109ac7c802a5a4f4794f2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:4fcbe0d9a3c845708ecc32102ad4abbcbd947d87e5cf91f186de75b5d84ec681,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:58a4e9a4dea86635c93ce37a2bb3c60ece62b3d656f6ee6a8845347cbb3e90fd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:6f2b843bc9f4ceb1ee873972d69e6bae6e1dbd378b486995bc3697d8bcff6339,ValueFrom:nil,},EnvVar
{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:03b4bb79b71d5ca7792d19c4c0ee08a5e5a407ad844c087305c42dd909ee7490,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:773daada6402d9cad089cdc809d6c0335456d057ac1a25441ab5d82add2f70f4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:7323406a63fb3fdbb3eea4da0f7e8ed89c94c9bd0ad5ecd6c18fa4a4c2c550c4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:7ae82068011e2d2e5ddc88c943fd32ff4a11902793e7a1df729811b2e27122a0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:0c762c15d9d98d39cc9dc3d1f9a70f9188fef58d4e2f3b0c69c896cab8da5e48,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:febf65561eeef5b36b70d0d65ee83f6451e43ec97bfab4d826e14215da6ff19b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:b8aadfc3d547c5ef1e27fcb573d4760cf8c2f2271eefe1793c35a0d46b640837,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:ecc91fd5079ee6d0c6ae1b11e97da790e33864d0e1930e574f959da2bddfa59a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:2e981e93f99c929a3f04e5e41c8f645d44d390a9aeee3c5193cce7ec2edcbf3a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:1
e5714637b6e1a24c2858fe6d9bbb3f00bc61d69ad74a657b1c23682bf4cb2b7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:35b8dcf27dc3b67f3840fa0e693ff312f74f7e22c634dff206a5c4d0133c716c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:e109e4863e05e803dbfe04917756fd52231c560c65353170a2000be6cc2bb53d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:6df0bebd9318ce11624413249e7e9781311638f276f8877668d3b382fe90e62f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:56b75d97f4a48c8cf58b3a7c18c43618efb308bf0188124f6301142e61299b0c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:a51ed62767206067aa501142dbf01f20b3d65325d30faf1b4d6424d5b17dfba5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:592e3cd32d3cc97a69093ad905b449aa374ffbb1b2644b738bb6c1434476d1f6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podifie
d-antelope-centos9/openstack-nova-conductor@sha256:9596452e283febbe08204d0ef0fd1992af3395d0969f7ac76663ed7c8be5b4d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:d61005a10bef1b37762a8a41e6755c1169241e36cc5f92886bca6f4f6b9c381a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:e6a4335bcbeed3cd3e73ac879f754e314761e4a417a67539ca88e96a79346328,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:97d88fc53421b699fc91983313d7beec4a0f177089e95bdf5ba15c3f521db9a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:5365e5c9c3ad2ede1b6945255b2cc6b009d642c39babdf25e0655282cfa646fe,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:5b55795d774e0ea160ff8a7fd491ed41cf2d93c7d821694abb3a879eaffcefeb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:26e955c46a6063eafcfeb79430bf3d9268dbe95687c00e63a624b3ec5a846f5a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:58939baa18ab09e2b24996c5f3665ae52274b781f661ea06a67c991e9a832d5a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:b8bff6857fec93c3c1521f1a8c23de21bcb86fc0f960972e81f6c3f95d4185be,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:943eee724277e252795909137538a553ef
5284c8103ad01b9be7b0138c66d14d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:d97b08fd421065c8c33a523973822ac468500cbe853069aa9214393fbda7a908,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:289dea3beea1cd4405895fc42e44372b35e4a941e31c59e102c333471a3ca9b7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:9b19894fa67a81bf8ba4159b55b49f38877c670aeb97e2021c341cef2a9294e4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:ea164961ad30453ad0301c6b73364e1f1024f689634c88dd98265f9c7048e31d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:6f9f2ea45f0271f6da8eb05a5f74cf5ce6769479346f5c2f407ee6f31a9c7ff3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:59448516174fc3bab679b9a8dd62cb9a9d16b5734aadbeb98e960e3b7c79bd22,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:adcdeb8ecd601fb03c3b0901d5b5111af2ca48f7dd443e22224db6daaf08f5d0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT
,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:2bf32d9b95899d7637dfe19d07cf1ecc9a06593984faff57a3c0dce060012edb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:7a452cd18b64d522e8a1e25bdcea543e9fe5f5b76e1c5e044c2b5334e06a326b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:6a46aa13aa359b8e782a22d67db42db02bbf2bb7e35df4b684ac1daeda38cde3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:f6824854bea6b2acbb00c34639799b4744818d4adbdd40e37dc5088f9ae18d58,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:a66d2fdc21f25c690f02e643d2666dbe7df43a64cd55086ec33d6755e6d809b9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:ad5cec8b914687f3b378754f76bd30ade09c1b33a5638816b64fee68ebe2ab45,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:d04912a69e111cb6ca00b5019cdc2ebc43b89e5fc090260718752184a2581072,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:f9a32c333aae6ef5bddd7ba613c17d42207d290e58c079b80235621fe2cd626c,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjgzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-84c868ff4cfg79v_openstack-operators(d99c70c9-9474-4418-8030-df6d871283e7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.555070 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m"] Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.560958 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv"] Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.563681 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhkjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5f67fbc655-2knl5_openstack-operators(de2d002c-ff31-4c5b-aaa6-9e19c00caf6c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.567403 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-llbl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-67cfc6749b-mhkk2_openstack-operators(a0ce9c8b-219c-40da-bc5b-b171446c36ba): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.567535 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ppw7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
watcher-operator-controller-manager-7f554bff7b-26jhx_openstack-operators(6913b400-809d-4aa7-b478-999c34cdf0da): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.587011 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g79hp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-68b6c87b68-cbf4m_openstack-operators(20e2b6da-45d9-40d7-8a93-05cc865543c6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.594705 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58sn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5458f77c4-j8cfc_openstack-operators(9744ec37-6d1a-4b31-b443-26ef804824f3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.869204 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" event={"ID":"af04a067-8a30-4d2d-a0ff-b3206375d952","Type":"ContainerStarted","Data":"c009add82fe5c31991c18676b0092d0a402d13d13c6431335f6fd182c58cbcc8"} Oct 10 07:07:09 crc 
kubenswrapper[4732]: I1010 07:07:09.893855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" event={"ID":"6913b400-809d-4aa7-b478-999c34cdf0da","Type":"ContainerStarted","Data":"0cba0c110085e2d529fe982f2b7935fb97bd9a0e9ba16013f475a72dad3fce7a"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.898119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" event={"ID":"33631d8c-63c6-4912-be80-748b6c997cae","Type":"ContainerStarted","Data":"8b0e561029b33745fdae456fddaabbd433050e11cba984d2297206b800d0ebeb"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.902568 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" event={"ID":"38e49d40-f9b0-476b-a875-891fdb26d8fc","Type":"ContainerStarted","Data":"c0d82bf91f4ee7b3adf442108fe9e287ac69bac2944fcb9ad85fa95d5632152f"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.906674 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" event={"ID":"6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b","Type":"ContainerStarted","Data":"29b9800f2dd75a9326ee9069e0a064a3e1f86bcec5dd04e50a6fa082e960169f"} Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.915212 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" podUID="d99c70c9-9474-4418-8030-df6d871283e7" Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.915332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" 
event={"ID":"9744ec37-6d1a-4b31-b443-26ef804824f3","Type":"ContainerStarted","Data":"b30173088fd0fb1b9833d2f5eef1db36027d9edfa0de4f56cac464d4339a778c"} Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.915467 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" podUID="a0ce9c8b-219c-40da-bc5b-b171446c36ba" Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.924031 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" event={"ID":"579b1489-9552-485c-92da-5386e7b2afeb","Type":"ContainerStarted","Data":"1fe4b3d8d8fdccceb29cfd2a1bea741d2dae27f44f42f3fc6d071d6c8d3e613d"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.927817 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" event={"ID":"20e2b6da-45d9-40d7-8a93-05cc865543c6","Type":"ContainerStarted","Data":"60a00e501961e478261fca8b3235dd52f9468b49a6e84c08bf044cf15fc1aaf1"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.942212 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" event={"ID":"a0ce9c8b-219c-40da-bc5b-b171446c36ba","Type":"ContainerStarted","Data":"49001d9ebdc4721b42acc83c88eb8ba666f7330329f8903dd974647e8d7f8043"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.955562 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" event={"ID":"6f554597-7d00-422a-b570-834795047cf9","Type":"ContainerStarted","Data":"b5c87356058a62a33b5ece0d72b0c9276586b70332efb71d49b45df018510cca"} Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.956226 4732 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" podUID="a0ce9c8b-219c-40da-bc5b-b171446c36ba" Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.959278 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" event={"ID":"9c145bb1-292a-4675-be8c-9bd49d4034f2","Type":"ContainerStarted","Data":"8541b6fec708a8d5c2bf7fadbdb8774e7e7656ee480a3b5a269219ca790151b6"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.972929 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" event={"ID":"7ed8853a-5e8a-4dce-abb2-73bc7375a2bb","Type":"ContainerStarted","Data":"78a44cc50af66ee1aad52653fbc0f7884de419f6e859417a0c8ed85358fcbdcd"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.986294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" event={"ID":"d99c70c9-9474-4418-8030-df6d871283e7","Type":"ContainerStarted","Data":"3f229f3987c177f3cec73fc614f700d0aaeeefe49616d735314d2a407d61659c"} Oct 10 07:07:09 crc kubenswrapper[4732]: E1010 07:07:09.988278 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" podUID="d99c70c9-9474-4418-8030-df6d871283e7" Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.991899 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" event={"ID":"0da772cc-f90b-4ee7-8793-2fd804249c91","Type":"ContainerStarted","Data":"f3b3ca1deed393b5c6897ac523273d927fd09c9f5531a7f9c2287de3e82c8e92"} Oct 10 07:07:09 crc kubenswrapper[4732]: I1010 07:07:09.994598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" event={"ID":"ac7e4be6-eb30-4fec-bb28-8f7181d7d337","Type":"ContainerStarted","Data":"40f70106f2def595ad814fac547d9adcddf3b7fba7e823201c2763f261040c29"} Oct 10 07:07:10 crc kubenswrapper[4732]: I1010 07:07:10.031408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" event={"ID":"48a56683-0762-4720-9640-c2b4e9ffb277","Type":"ContainerStarted","Data":"b00240d2a32c4ff1bd3fa682ef08cac9c9f450ac342fcd786d06ad898f30c5d7"} Oct 10 07:07:10 crc kubenswrapper[4732]: I1010 07:07:10.046636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" event={"ID":"455baf3e-9434-4962-93bb-cd6497747fa5","Type":"ContainerStarted","Data":"61af9addef2662e71127b50d745ac3caffb2b2e9c9cac2cb2bab3cb813d1db0b"} Oct 10 07:07:10 crc kubenswrapper[4732]: I1010 07:07:10.053440 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" event={"ID":"de2d002c-ff31-4c5b-aaa6-9e19c00caf6c","Type":"ContainerStarted","Data":"ed5adce3b8b1e2f0e60db8906dedadf82e3cf95216dc0d274df237d4d66909e9"} Oct 10 07:07:10 crc kubenswrapper[4732]: I1010 07:07:10.055491 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" 
event={"ID":"0f893cf0-2c81-455d-a447-b0745e767b18","Type":"ContainerStarted","Data":"731500fa59999ed6779553aad1699e335faad98beb43bf4b24de4947f3a94381"} Oct 10 07:07:10 crc kubenswrapper[4732]: I1010 07:07:10.063162 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" event={"ID":"26800fd4-b33e-4bb8-815b-5ec03fc9b22b","Type":"ContainerStarted","Data":"39b439efd216fea6c6f93d5f3099f6b4911f2efb2cd50950dc38e271f70b413d"} Oct 10 07:07:10 crc kubenswrapper[4732]: E1010 07:07:10.419568 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" podUID="6913b400-809d-4aa7-b478-999c34cdf0da" Oct 10 07:07:10 crc kubenswrapper[4732]: E1010 07:07:10.457042 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" podUID="9744ec37-6d1a-4b31-b443-26ef804824f3" Oct 10 07:07:10 crc kubenswrapper[4732]: E1010 07:07:10.539892 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" podUID="de2d002c-ff31-4c5b-aaa6-9e19c00caf6c" Oct 10 07:07:10 crc kubenswrapper[4732]: E1010 07:07:10.610295 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" podUID="20e2b6da-45d9-40d7-8a93-05cc865543c6" Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.086473 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" event={"ID":"6913b400-809d-4aa7-b478-999c34cdf0da","Type":"ContainerStarted","Data":"5adb02c870738625b8b2462cd51f11781d39af4453325a375ec4a21d0747d896"} Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.093902 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" event={"ID":"6f554597-7d00-422a-b570-834795047cf9","Type":"ContainerStarted","Data":"5f54a3e5f2813c4fb0a000d594cf0d50bf824752c23c2eadead7513dadecbdf7"} Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.093960 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" event={"ID":"6f554597-7d00-422a-b570-834795047cf9","Type":"ContainerStarted","Data":"fcf92c444701ea05c314f301b3593ea4b46140cbb69085a1dcd846c473ea0646"} Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.094651 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:11 crc kubenswrapper[4732]: E1010 07:07:11.095226 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" podUID="6913b400-809d-4aa7-b478-999c34cdf0da" Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.101171 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" event={"ID":"9744ec37-6d1a-4b31-b443-26ef804824f3","Type":"ContainerStarted","Data":"3d462a9957583237939f96ec0c527cbc1043e329f7b91b188dee4df3560e2494"} Oct 10 07:07:11 crc 
kubenswrapper[4732]: E1010 07:07:11.102756 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" podUID="9744ec37-6d1a-4b31-b443-26ef804824f3" Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.113650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" event={"ID":"de2d002c-ff31-4c5b-aaa6-9e19c00caf6c","Type":"ContainerStarted","Data":"6513f138904b03b44e767ff6d456ffb127f95339375d966d3360ce5e877d2ff6"} Oct 10 07:07:11 crc kubenswrapper[4732]: E1010 07:07:11.117055 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" podUID="de2d002c-ff31-4c5b-aaa6-9e19c00caf6c" Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.148290 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" event={"ID":"d99c70c9-9474-4418-8030-df6d871283e7","Type":"ContainerStarted","Data":"422c5339bd25af0823710325fa709ceb8f99281894e1dd07dcaf2e15f525812e"} Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.161613 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" event={"ID":"20e2b6da-45d9-40d7-8a93-05cc865543c6","Type":"ContainerStarted","Data":"01dca1adcee263ddeae965fe8c5ea51251d989319f902c652c753b3381916969"} Oct 10 07:07:11 crc 
kubenswrapper[4732]: I1010 07:07:11.166659 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" event={"ID":"a0ce9c8b-219c-40da-bc5b-b171446c36ba","Type":"ContainerStarted","Data":"581001bb67d8450c338a4ab7d8bb3a2f9f2690e9296ac70999b4b3476014d7a2"} Oct 10 07:07:11 crc kubenswrapper[4732]: E1010 07:07:11.167479 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" podUID="20e2b6da-45d9-40d7-8a93-05cc865543c6" Oct 10 07:07:11 crc kubenswrapper[4732]: E1010 07:07:11.167685 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" podUID="d99c70c9-9474-4418-8030-df6d871283e7" Oct 10 07:07:11 crc kubenswrapper[4732]: E1010 07:07:11.181239 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" podUID="a0ce9c8b-219c-40da-bc5b-b171446c36ba" Oct 10 07:07:11 crc kubenswrapper[4732]: I1010 07:07:11.206034 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" podStartSLOduration=4.206010779 podStartE2EDuration="4.206010779s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:07:11.205175316 +0000 UTC m=+958.274766557" watchObservedRunningTime="2025-10-10 07:07:11.206010779 +0000 UTC m=+958.275602020" Oct 10 07:07:12 crc kubenswrapper[4732]: E1010 07:07:12.200592 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" podUID="a0ce9c8b-219c-40da-bc5b-b171446c36ba" Oct 10 07:07:12 crc kubenswrapper[4732]: E1010 07:07:12.200633 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" podUID="de2d002c-ff31-4c5b-aaa6-9e19c00caf6c" Oct 10 07:07:12 crc kubenswrapper[4732]: E1010 07:07:12.202143 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" podUID="9744ec37-6d1a-4b31-b443-26ef804824f3" Oct 10 07:07:12 crc kubenswrapper[4732]: E1010 07:07:12.202194 4732 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:98a5233f0596591acdf2c6a5838b08be108787cdb6ad1995b2b7886bac0fe6ca\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" podUID="6913b400-809d-4aa7-b478-999c34cdf0da" Oct 10 07:07:12 crc kubenswrapper[4732]: E1010 07:07:12.202290 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:a17fc270857869fd1efe5020b2a1cb8c2abbd838f08de88f3a6a59e8754ec351\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" podUID="d99c70c9-9474-4418-8030-df6d871283e7" Oct 10 07:07:12 crc kubenswrapper[4732]: E1010 07:07:12.203225 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d33c1f507e1f5b9a4bf226ad98917e92101ac66b36e19d35cbe04ae7014f6bff\\\"\"" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" podUID="20e2b6da-45d9-40d7-8a93-05cc865543c6" Oct 10 07:07:18 crc kubenswrapper[4732]: I1010 07:07:18.821005 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5698bb9464-8qpcv" Oct 10 07:07:21 crc kubenswrapper[4732]: I1010 07:07:21.259822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" event={"ID":"0da772cc-f90b-4ee7-8793-2fd804249c91","Type":"ContainerStarted","Data":"7b584acaaed4ed452d2a3e550194201dcb03c351b1db45ab8997c72223c436f0"} Oct 10 07:07:21 crc kubenswrapper[4732]: I1010 07:07:21.261439 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" event={"ID":"579b1489-9552-485c-92da-5386e7b2afeb","Type":"ContainerStarted","Data":"6710033f4f907b00778fb3e86a05c9570183369e8cc8a72d5a6ea3ea2cbd681b"} Oct 10 07:07:21 crc kubenswrapper[4732]: I1010 07:07:21.262873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" event={"ID":"38e49d40-f9b0-476b-a875-891fdb26d8fc","Type":"ContainerStarted","Data":"eef8e97cb38a67fd2bc770e634caeb32bcebde4bd95e2f7b402817afdaa587f7"} Oct 10 07:07:21 crc kubenswrapper[4732]: I1010 07:07:21.264516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" event={"ID":"455baf3e-9434-4962-93bb-cd6497747fa5","Type":"ContainerStarted","Data":"30401b5fa34bd046c76ff17f1b992f750efd9848d65e5a9f31e8f5637061bb71"} Oct 10 07:07:21 crc kubenswrapper[4732]: I1010 07:07:21.265865 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" event={"ID":"eb7804d7-814a-4aeb-b9d5-b359cada4441","Type":"ContainerStarted","Data":"6d9cc70aa6ab3fd89ff7524b6acac88f1d342b36564189766602613a17ded0c0"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.293434 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" event={"ID":"0f893cf0-2c81-455d-a447-b0745e767b18","Type":"ContainerStarted","Data":"c48730e86762410e77aff36812bf5d325039322f3c4ceb654365c6f32ebcf283"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.293487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" event={"ID":"0f893cf0-2c81-455d-a447-b0745e767b18","Type":"ContainerStarted","Data":"dc8020170085461b4a970d287eeeb70eac99abb3b8ed46f49c27e78ba2261bd3"} Oct 10 
07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.293615 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.303211 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" event={"ID":"48a56683-0762-4720-9640-c2b4e9ffb277","Type":"ContainerStarted","Data":"d5f6c6bdaa060c617d0137a2e4c5638138090732059fa5af2615bf2a589405fd"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.317293 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" event={"ID":"6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b","Type":"ContainerStarted","Data":"0a46e9e798c9fe10968b416bce6ae9b74c7bc81308e0e80ee2f4759f1d7d6c7e"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.317347 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" event={"ID":"6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b","Type":"ContainerStarted","Data":"8c7254958c7d34fac1d682452382b03966ef4328369148192b9828a4ae991a8c"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.317420 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.326092 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" event={"ID":"9c145bb1-292a-4675-be8c-9bd49d4034f2","Type":"ContainerStarted","Data":"d6dc6caad7d908f96976372e175defa6c2bbfd62198eb428d91faeff11fe88e2"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.333762 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" podStartSLOduration=3.529097066 podStartE2EDuration="15.33374485s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.981766983 +0000 UTC m=+956.051358224" lastFinishedPulling="2025-10-10 07:07:20.786414767 +0000 UTC m=+967.856006008" observedRunningTime="2025-10-10 07:07:22.327937202 +0000 UTC m=+969.397528453" watchObservedRunningTime="2025-10-10 07:07:22.33374485 +0000 UTC m=+969.403336091" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.338735 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" event={"ID":"edbb64bd-a7b0-40ea-90e2-7cc1fee46f76","Type":"ContainerStarted","Data":"74179f52462ee462b7afb4eb3caecd0f64070639c642beff3cb1aa74ec0da6d6"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.349758 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" event={"ID":"eb7804d7-814a-4aeb-b9d5-b359cada4441","Type":"ContainerStarted","Data":"c3a27a76169eb3cb8976a495824295c6c59872d6decdd608eb4f93f8e0603cf6"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.349847 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.365294 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" podStartSLOduration=3.593249851 podStartE2EDuration="15.365266658s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.013403853 +0000 UTC m=+956.082995094" lastFinishedPulling="2025-10-10 07:07:20.78542066 +0000 UTC m=+967.855011901" observedRunningTime="2025-10-10 07:07:22.349125649 +0000 UTC m=+969.418716900" 
watchObservedRunningTime="2025-10-10 07:07:22.365266658 +0000 UTC m=+969.434857899" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.375583 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" event={"ID":"0da772cc-f90b-4ee7-8793-2fd804249c91","Type":"ContainerStarted","Data":"962063e3b1d53b1c1bb6f5e7b8d89a01d84db223b008a3b9276024d41654a126"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.375703 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.387627 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" podStartSLOduration=3.845021967 podStartE2EDuration="16.387606676s" podCreationTimestamp="2025-10-10 07:07:06 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.196244584 +0000 UTC m=+955.265835825" lastFinishedPulling="2025-10-10 07:07:20.738829293 +0000 UTC m=+967.808420534" observedRunningTime="2025-10-10 07:07:22.378604161 +0000 UTC m=+969.448195412" watchObservedRunningTime="2025-10-10 07:07:22.387606676 +0000 UTC m=+969.457197917" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.387807 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" event={"ID":"ac7e4be6-eb30-4fec-bb28-8f7181d7d337","Type":"ContainerStarted","Data":"a30ada354547994e8d0ded4c486713158cc1c97b2f2f3d271d0a4bce92dc8dfa"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.402902 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" event={"ID":"33631d8c-63c6-4912-be80-748b6c997cae","Type":"ContainerStarted","Data":"639ff7c99df2ed8dea2bb06051bde47f564834c2d9270b06e045a1f2ad007991"} Oct 10 
07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.410238 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" event={"ID":"26800fd4-b33e-4bb8-815b-5ec03fc9b22b","Type":"ContainerStarted","Data":"914fe0747103219ab5dc00c474edad8bc44549cab36b03e458ac7ade0f6ade6b"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.412842 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" podStartSLOduration=4.143037338 podStartE2EDuration="15.412824582s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.518618067 +0000 UTC m=+956.588209308" lastFinishedPulling="2025-10-10 07:07:20.788405311 +0000 UTC m=+967.857996552" observedRunningTime="2025-10-10 07:07:22.407082085 +0000 UTC m=+969.476673336" watchObservedRunningTime="2025-10-10 07:07:22.412824582 +0000 UTC m=+969.482415823" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.413243 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" event={"ID":"455baf3e-9434-4962-93bb-cd6497747fa5","Type":"ContainerStarted","Data":"3dc154fae98552bc7c855c3118fb0c4b40229496341855b079eb21ee5614c807"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.413965 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.416516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" event={"ID":"ebf07c0d-be2b-41ec-8363-bdfcc2d3802a","Type":"ContainerStarted","Data":"acba04eaf3fecc293fb4978e1efde052ebb9eb2f9eb2edbb5187ab4d7b7630e4"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.416558 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" event={"ID":"ebf07c0d-be2b-41ec-8363-bdfcc2d3802a","Type":"ContainerStarted","Data":"3fa5183cebce424ebe02126c6ac0879b2f6c56085e37c11bbcb64868589657bd"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.417220 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.442944 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" event={"ID":"7ed8853a-5e8a-4dce-abb2-73bc7375a2bb","Type":"ContainerStarted","Data":"f76bbc52e4d7160293346044738983ee6f83432bdb2256544462ce5e49b96605"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.455035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" event={"ID":"af04a067-8a30-4d2d-a0ff-b3206375d952","Type":"ContainerStarted","Data":"cee26ecfe2f62862761ce4b50d0f061dac75c77b9a595843383ef1c096da2f7a"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.456230 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" event={"ID":"579b1489-9552-485c-92da-5386e7b2afeb","Type":"ContainerStarted","Data":"6090e0db5cf5502db507ad779a9740752c030e3069acf3ef32d812d5a3bf85b1"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.456913 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.458532 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" 
event={"ID":"38e49d40-f9b0-476b-a875-891fdb26d8fc","Type":"ContainerStarted","Data":"6826ed926aa9cb03ef66faf95243c9316874a122c8a69049ae96bce27ca944ad"} Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.458941 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.479047 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" podStartSLOduration=4.084014709 podStartE2EDuration="16.479031353s" podCreationTimestamp="2025-10-10 07:07:06 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.388380471 +0000 UTC m=+955.457971712" lastFinishedPulling="2025-10-10 07:07:20.783397115 +0000 UTC m=+967.852988356" observedRunningTime="2025-10-10 07:07:22.47894749 +0000 UTC m=+969.548538741" watchObservedRunningTime="2025-10-10 07:07:22.479031353 +0000 UTC m=+969.548622594" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.481251 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf" podStartSLOduration=4.237312362 podStartE2EDuration="15.481244413s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.52388675 +0000 UTC m=+956.593477991" lastFinishedPulling="2025-10-10 07:07:20.767818801 +0000 UTC m=+967.837410042" observedRunningTime="2025-10-10 07:07:22.443527927 +0000 UTC m=+969.513119168" watchObservedRunningTime="2025-10-10 07:07:22.481244413 +0000 UTC m=+969.550835654" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.540723 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" podStartSLOduration=4.833784384 podStartE2EDuration="16.5407044s" podCreationTimestamp="2025-10-10 07:07:06 +0000 
UTC" firstStartedPulling="2025-10-10 07:07:09.076311845 +0000 UTC m=+956.145903086" lastFinishedPulling="2025-10-10 07:07:20.783231861 +0000 UTC m=+967.852823102" observedRunningTime="2025-10-10 07:07:22.506794548 +0000 UTC m=+969.576385809" watchObservedRunningTime="2025-10-10 07:07:22.5407044 +0000 UTC m=+969.610295641" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.542747 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" podStartSLOduration=4.754550989 podStartE2EDuration="16.542740146s" podCreationTimestamp="2025-10-10 07:07:06 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.984764254 +0000 UTC m=+956.054355495" lastFinishedPulling="2025-10-10 07:07:20.772953401 +0000 UTC m=+967.842544652" observedRunningTime="2025-10-10 07:07:22.539398465 +0000 UTC m=+969.608989706" watchObservedRunningTime="2025-10-10 07:07:22.542740146 +0000 UTC m=+969.612331387" Oct 10 07:07:22 crc kubenswrapper[4732]: I1010 07:07:22.566210 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" podStartSLOduration=3.848219887 podStartE2EDuration="15.566193224s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.065259414 +0000 UTC m=+956.134850655" lastFinishedPulling="2025-10-10 07:07:20.783232751 +0000 UTC m=+967.852823992" observedRunningTime="2025-10-10 07:07:22.560320934 +0000 UTC m=+969.629912195" watchObservedRunningTime="2025-10-10 07:07:22.566193224 +0000 UTC m=+969.635784455" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.468312 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" event={"ID":"48a56683-0762-4720-9640-c2b4e9ffb277","Type":"ContainerStarted","Data":"2d288f5f008eca285a78dcff1f732ea84e8d33ec1402d9b7db5faa443d32a70b"} Oct 10 07:07:23 
crc kubenswrapper[4732]: I1010 07:07:23.468361 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.470949 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" event={"ID":"9c145bb1-292a-4675-be8c-9bd49d4034f2","Type":"ContainerStarted","Data":"adc28f96c65184e222c88bf38222479ffa68d2409d6a2fb0b7d547f88b63f167"} Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.471149 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.475489 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" event={"ID":"edbb64bd-a7b0-40ea-90e2-7cc1fee46f76","Type":"ContainerStarted","Data":"1cdc766891a1f8b40c8e3dd8098bbe66467804e7df98496ee1a74bdb91e80b4b"} Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.475923 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.478400 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" event={"ID":"7ed8853a-5e8a-4dce-abb2-73bc7375a2bb","Type":"ContainerStarted","Data":"314ac471e5e2bb40380adbffaafe5ddbf9cfdb4f3aca61eb934fd82f77c3741c"} Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.478601 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.480303 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" event={"ID":"af04a067-8a30-4d2d-a0ff-b3206375d952","Type":"ContainerStarted","Data":"c5934d641794e79513dd4695f2dc2dc0262349cb55bc43c65105ae53f154cfed"} Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.480441 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.481705 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" event={"ID":"ac7e4be6-eb30-4fec-bb28-8f7181d7d337","Type":"ContainerStarted","Data":"892d9e685d608419e2d514c6c48a8b232d58b63b61ee5b979275773e580ba9d0"} Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.482260 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.483836 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" event={"ID":"33631d8c-63c6-4912-be80-748b6c997cae","Type":"ContainerStarted","Data":"26211fed33e32406ff93f1a6a5cd5473697a917a65f5d686b879b0761049f549"} Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.484507 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.507850 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" podStartSLOduration=4.69452635 podStartE2EDuration="16.50783114s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.97100288 +0000 UTC m=+956.040594121" lastFinishedPulling="2025-10-10 
07:07:20.78430766 +0000 UTC m=+967.853898911" observedRunningTime="2025-10-10 07:07:23.489616634 +0000 UTC m=+970.559207875" watchObservedRunningTime="2025-10-10 07:07:23.50783114 +0000 UTC m=+970.577422391" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.511724 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" podStartSLOduration=4.964646154 podStartE2EDuration="17.511714045s" podCreationTimestamp="2025-10-10 07:07:06 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.232033437 +0000 UTC m=+955.301624678" lastFinishedPulling="2025-10-10 07:07:20.779101328 +0000 UTC m=+967.848692569" observedRunningTime="2025-10-10 07:07:23.505367483 +0000 UTC m=+970.574958734" watchObservedRunningTime="2025-10-10 07:07:23.511714045 +0000 UTC m=+970.581305296" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.526391 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" podStartSLOduration=5.817004932 podStartE2EDuration="17.526372664s" podCreationTimestamp="2025-10-10 07:07:06 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.075647457 +0000 UTC m=+956.145238698" lastFinishedPulling="2025-10-10 07:07:20.785015189 +0000 UTC m=+967.854606430" observedRunningTime="2025-10-10 07:07:23.522702444 +0000 UTC m=+970.592293715" watchObservedRunningTime="2025-10-10 07:07:23.526372664 +0000 UTC m=+970.595963905" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.540493 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" podStartSLOduration=4.742377822 podStartE2EDuration="16.540474248s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.998342954 +0000 UTC m=+956.067934195" lastFinishedPulling="2025-10-10 07:07:20.79643938 +0000 UTC 
m=+967.866030621" observedRunningTime="2025-10-10 07:07:23.536166041 +0000 UTC m=+970.605757302" watchObservedRunningTime="2025-10-10 07:07:23.540474248 +0000 UTC m=+970.610065489" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.558667 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" podStartSLOduration=5.350155865 podStartE2EDuration="16.558646072s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.586540365 +0000 UTC m=+956.656131606" lastFinishedPulling="2025-10-10 07:07:20.795030572 +0000 UTC m=+967.864621813" observedRunningTime="2025-10-10 07:07:23.551895288 +0000 UTC m=+970.621486529" watchObservedRunningTime="2025-10-10 07:07:23.558646072 +0000 UTC m=+970.628237303" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.574125 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" podStartSLOduration=4.784877938 podStartE2EDuration="16.574107233s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:08.983767507 +0000 UTC m=+956.053358748" lastFinishedPulling="2025-10-10 07:07:20.772996802 +0000 UTC m=+967.842588043" observedRunningTime="2025-10-10 07:07:23.572777846 +0000 UTC m=+970.642369097" watchObservedRunningTime="2025-10-10 07:07:23.574107233 +0000 UTC m=+970.643698474" Oct 10 07:07:23 crc kubenswrapper[4732]: I1010 07:07:23.591372 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" podStartSLOduration=4.831610759 podStartE2EDuration="16.591355612s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.012988192 +0000 UTC m=+956.082579433" lastFinishedPulling="2025-10-10 07:07:20.772733045 +0000 UTC m=+967.842324286" 
observedRunningTime="2025-10-10 07:07:23.587331362 +0000 UTC m=+970.656922623" watchObservedRunningTime="2025-10-10 07:07:23.591355612 +0000 UTC m=+970.660946853" Oct 10 07:07:24 crc kubenswrapper[4732]: I1010 07:07:24.496357 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" event={"ID":"20e2b6da-45d9-40d7-8a93-05cc865543c6","Type":"ContainerStarted","Data":"5ab1f360fbb438772f828ba59741051784dc8ae730209fdb814d0008d1452583"} Oct 10 07:07:24 crc kubenswrapper[4732]: I1010 07:07:24.521088 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" podStartSLOduration=2.759998924 podStartE2EDuration="17.521068582s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.586898885 +0000 UTC m=+956.656490126" lastFinishedPulling="2025-10-10 07:07:24.347968543 +0000 UTC m=+971.417559784" observedRunningTime="2025-10-10 07:07:24.517572297 +0000 UTC m=+971.587163548" watchObservedRunningTime="2025-10-10 07:07:24.521068582 +0000 UTC m=+971.590659833" Oct 10 07:07:26 crc kubenswrapper[4732]: I1010 07:07:26.515864 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" event={"ID":"de2d002c-ff31-4c5b-aaa6-9e19c00caf6c","Type":"ContainerStarted","Data":"410ac9b2509a30f18098798a20707bb0806ee1634bf602e6c7a4668d59fd1b57"} Oct 10 07:07:26 crc kubenswrapper[4732]: I1010 07:07:26.516375 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" Oct 10 07:07:26 crc kubenswrapper[4732]: I1010 07:07:26.535443 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" podStartSLOduration=3.581568077 
podStartE2EDuration="19.535422509s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.56357147 +0000 UTC m=+956.633162711" lastFinishedPulling="2025-10-10 07:07:25.517425902 +0000 UTC m=+972.587017143" observedRunningTime="2025-10-10 07:07:26.52954687 +0000 UTC m=+973.599138141" watchObservedRunningTime="2025-10-10 07:07:26.535422509 +0000 UTC m=+973.605013750" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.239736 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-658bdf4b74-b54wh" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.253333 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7b7fb68549-xt4c5" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.339378 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7ffbcb7588-s8qhg" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.478540 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-9c5c78d49-dpw4c" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.506979 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55b6b7c7b8-zz2vz" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.545395 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f9fb45f8f-42jm6" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.555211 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-79d585cb66-4hzzt" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.575352 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5df598886f-s5qjg" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.586191 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-85d5d9dd78-pbxl8" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.593234 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84b9b84486-9zhd2" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.627433 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-858f76bbdd-gvtth" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.683590 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69fdcfc5f5-mx78z" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.752756 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-79df5fb58c-sl8ht" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.792625 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" Oct 10 07:07:27 crc kubenswrapper[4732]: I1010 07:07:27.885653 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-db6d7f97b-8zx8p" Oct 10 07:07:28 crc kubenswrapper[4732]: I1010 07:07:28.070920 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-656bcbd775-nztbg" Oct 10 07:07:28 crc kubenswrapper[4732]: I1010 07:07:28.536419 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" event={"ID":"9744ec37-6d1a-4b31-b443-26ef804824f3","Type":"ContainerStarted","Data":"edea94293ff90ffff619ae8b77477df1cbd5ce45466adf9cdb9b95c470bbb5ab"} Oct 10 07:07:28 crc kubenswrapper[4732]: I1010 07:07:28.537027 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" Oct 10 07:07:28 crc kubenswrapper[4732]: I1010 07:07:28.538330 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" event={"ID":"6913b400-809d-4aa7-b478-999c34cdf0da","Type":"ContainerStarted","Data":"1daaca18929ae112a6bd4e838232f25d164f0e5b79e20899b862b2577d557bc7"} Oct 10 07:07:28 crc kubenswrapper[4732]: I1010 07:07:28.538509 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" Oct 10 07:07:28 crc kubenswrapper[4732]: I1010 07:07:28.560529 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" podStartSLOduration=3.805440153 podStartE2EDuration="21.560508828s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.594576783 +0000 UTC m=+956.664168024" lastFinishedPulling="2025-10-10 07:07:27.349645458 +0000 UTC m=+974.419236699" observedRunningTime="2025-10-10 07:07:28.556732786 +0000 UTC m=+975.626324047" watchObservedRunningTime="2025-10-10 07:07:28.560508828 +0000 UTC m=+975.630100069" Oct 10 07:07:28 crc kubenswrapper[4732]: I1010 07:07:28.576254 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" podStartSLOduration=3.777500863 podStartE2EDuration="21.576232133s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" 
firstStartedPulling="2025-10-10 07:07:09.567478096 +0000 UTC m=+956.637069337" lastFinishedPulling="2025-10-10 07:07:27.366209366 +0000 UTC m=+974.435800607" observedRunningTime="2025-10-10 07:07:28.570669512 +0000 UTC m=+975.640260763" watchObservedRunningTime="2025-10-10 07:07:28.576232133 +0000 UTC m=+975.645823374" Oct 10 07:07:30 crc kubenswrapper[4732]: I1010 07:07:30.554962 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" event={"ID":"d99c70c9-9474-4418-8030-df6d871283e7","Type":"ContainerStarted","Data":"af69e1ad3f2f2d4e8c185570237d95305ca0411967fd91c36a7f692e9950244b"} Oct 10 07:07:30 crc kubenswrapper[4732]: I1010 07:07:30.555496 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:30 crc kubenswrapper[4732]: I1010 07:07:30.557350 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" event={"ID":"a0ce9c8b-219c-40da-bc5b-b171446c36ba","Type":"ContainerStarted","Data":"8868b4904a515cef7647c69942d89d4227c196a0e8c3f238fe28ba671c6e3b7c"} Oct 10 07:07:30 crc kubenswrapper[4732]: I1010 07:07:30.557549 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" Oct 10 07:07:30 crc kubenswrapper[4732]: I1010 07:07:30.585613 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" podStartSLOduration=3.710857998 podStartE2EDuration="23.585594436s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.547679958 +0000 UTC m=+956.617271199" lastFinishedPulling="2025-10-10 07:07:29.422416386 +0000 UTC m=+976.492007637" 
observedRunningTime="2025-10-10 07:07:30.582109472 +0000 UTC m=+977.651700733" watchObservedRunningTime="2025-10-10 07:07:30.585594436 +0000 UTC m=+977.655185677" Oct 10 07:07:37 crc kubenswrapper[4732]: I1010 07:07:37.796300 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-68b6c87b68-cbf4m" Oct 10 07:07:37 crc kubenswrapper[4732]: I1010 07:07:37.812460 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" podStartSLOduration=11.497001363 podStartE2EDuration="30.81243416s" podCreationTimestamp="2025-10-10 07:07:07 +0000 UTC" firstStartedPulling="2025-10-10 07:07:09.56689879 +0000 UTC m=+956.636490031" lastFinishedPulling="2025-10-10 07:07:28.882331587 +0000 UTC m=+975.951922828" observedRunningTime="2025-10-10 07:07:30.612923895 +0000 UTC m=+977.682515146" watchObservedRunningTime="2025-10-10 07:07:37.81243416 +0000 UTC m=+984.882025431" Oct 10 07:07:37 crc kubenswrapper[4732]: I1010 07:07:37.818646 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5f67fbc655-2knl5" Oct 10 07:07:37 crc kubenswrapper[4732]: I1010 07:07:37.851828 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-67cfc6749b-mhkk2" Oct 10 07:07:37 crc kubenswrapper[4732]: I1010 07:07:37.973291 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5458f77c4-j8cfc" Oct 10 07:07:38 crc kubenswrapper[4732]: I1010 07:07:38.067215 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-7f554bff7b-26jhx" Oct 10 07:07:38 crc kubenswrapper[4732]: I1010 07:07:38.328857 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84c868ff4cfg79v" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.211130 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-bh92k"] Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.216943 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.223103 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.223374 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-6l7gq" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.223499 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.227118 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-bh92k"] Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.228299 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.296302 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8168f69a-05b1-46d0-b5df-40c830f02a0c-config\") pod \"dnsmasq-dns-7bfcb9d745-bh92k\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.296746 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzsb5\" (UniqueName: \"kubernetes.io/projected/8168f69a-05b1-46d0-b5df-40c830f02a0c-kube-api-access-jzsb5\") pod 
\"dnsmasq-dns-7bfcb9d745-bh92k\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.303583 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-jv6g7"] Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.304735 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.306315 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.323743 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-jv6g7"] Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.397658 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzsb5\" (UniqueName: \"kubernetes.io/projected/8168f69a-05b1-46d0-b5df-40c830f02a0c-kube-api-access-jzsb5\") pod \"dnsmasq-dns-7bfcb9d745-bh92k\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.397955 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8168f69a-05b1-46d0-b5df-40c830f02a0c-config\") pod \"dnsmasq-dns-7bfcb9d745-bh92k\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.398955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8168f69a-05b1-46d0-b5df-40c830f02a0c-config\") pod \"dnsmasq-dns-7bfcb9d745-bh92k\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.432570 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzsb5\" (UniqueName: \"kubernetes.io/projected/8168f69a-05b1-46d0-b5df-40c830f02a0c-kube-api-access-jzsb5\") pod \"dnsmasq-dns-7bfcb9d745-bh92k\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.499083 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/251bdaef-c7c6-4efe-af9a-c87cacb645c3-kube-api-access-n72nb\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.499220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-config\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.499262 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-dns-svc\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.543234 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.600424 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/251bdaef-c7c6-4efe-af9a-c87cacb645c3-kube-api-access-n72nb\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.600519 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-config\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.600551 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-dns-svc\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.601455 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-dns-svc\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.601510 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-config\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 
07:07:51.621439 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/251bdaef-c7c6-4efe-af9a-c87cacb645c3-kube-api-access-n72nb\") pod \"dnsmasq-dns-758b79db4c-jv6g7\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.917594 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.964684 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-bh92k"] Oct 10 07:07:51 crc kubenswrapper[4732]: W1010 07:07:51.974919 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8168f69a_05b1_46d0_b5df_40c830f02a0c.slice/crio-55f811f6ef4cb3d46b59fa94f0695ee7a052282ab4e401a9f31f43d9a11d2791 WatchSource:0}: Error finding container 55f811f6ef4cb3d46b59fa94f0695ee7a052282ab4e401a9f31f43d9a11d2791: Status 404 returned error can't find the container with id 55f811f6ef4cb3d46b59fa94f0695ee7a052282ab4e401a9f31f43d9a11d2791 Oct 10 07:07:51 crc kubenswrapper[4732]: I1010 07:07:51.977149 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:07:52 crc kubenswrapper[4732]: I1010 07:07:52.335461 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-jv6g7"] Oct 10 07:07:52 crc kubenswrapper[4732]: W1010 07:07:52.340339 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod251bdaef_c7c6_4efe_af9a_c87cacb645c3.slice/crio-5b8f0d78ccf4a3b1a5086d28537d0646d241e827ef3a2f347b9263ff41c50b37 WatchSource:0}: Error finding container 5b8f0d78ccf4a3b1a5086d28537d0646d241e827ef3a2f347b9263ff41c50b37: Status 404 
returned error can't find the container with id 5b8f0d78ccf4a3b1a5086d28537d0646d241e827ef3a2f347b9263ff41c50b37 Oct 10 07:07:52 crc kubenswrapper[4732]: I1010 07:07:52.729257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" event={"ID":"8168f69a-05b1-46d0-b5df-40c830f02a0c","Type":"ContainerStarted","Data":"55f811f6ef4cb3d46b59fa94f0695ee7a052282ab4e401a9f31f43d9a11d2791"} Oct 10 07:07:52 crc kubenswrapper[4732]: I1010 07:07:52.730525 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" event={"ID":"251bdaef-c7c6-4efe-af9a-c87cacb645c3","Type":"ContainerStarted","Data":"5b8f0d78ccf4a3b1a5086d28537d0646d241e827ef3a2f347b9263ff41c50b37"} Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.049366 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-jv6g7"] Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.079556 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644597f84c-k4q5x"] Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.084523 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.090859 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-k4q5x"] Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.252504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-dns-svc\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.252594 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-config\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.252647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnmx\" (UniqueName: \"kubernetes.io/projected/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-kube-api-access-rjnmx\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.355497 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnmx\" (UniqueName: \"kubernetes.io/projected/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-kube-api-access-rjnmx\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.355598 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-dns-svc\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.355665 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-config\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.356807 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-dns-svc\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.356825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-config\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.373032 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-bh92k"] Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.393867 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnmx\" (UniqueName: \"kubernetes.io/projected/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-kube-api-access-rjnmx\") pod \"dnsmasq-dns-644597f84c-k4q5x\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") " pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.409247 4732 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.415321 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77597f887-bxpbg"] Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.417242 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.437323 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-bxpbg"] Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.456643 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-dns-svc\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.456733 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-config\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.456785 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt24t\" (UniqueName: \"kubernetes.io/projected/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-kube-api-access-nt24t\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.557586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-config\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.557660 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt24t\" (UniqueName: \"kubernetes.io/projected/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-kube-api-access-nt24t\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.557786 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-dns-svc\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.558625 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-dns-svc\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.559316 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-config\") pod \"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.580924 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt24t\" (UniqueName: \"kubernetes.io/projected/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-kube-api-access-nt24t\") pod 
\"dnsmasq-dns-77597f887-bxpbg\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.770138 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-k4q5x"] Oct 10 07:07:54 crc kubenswrapper[4732]: W1010 07:07:54.783391 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3d19ccd_5bea_4c1b_9ae0_241606d91aa4.slice/crio-a93a879b56edabc21fb6f3ca793103fed86cb4eb8b009125ffc2f0465515c478 WatchSource:0}: Error finding container a93a879b56edabc21fb6f3ca793103fed86cb4eb8b009125ffc2f0465515c478: Status 404 returned error can't find the container with id a93a879b56edabc21fb6f3ca793103fed86cb4eb8b009125ffc2f0465515c478 Oct 10 07:07:54 crc kubenswrapper[4732]: I1010 07:07:54.835540 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.219983 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.225182 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.228797 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.229166 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qkdf9" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.229236 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.230517 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.231308 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.231482 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.231953 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.237124 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.281382 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77597f887-bxpbg"] Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379546 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379605 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379663 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379699 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/565f831c-0da8-4481-8461-8522e0cfa801-pod-info\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379726 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379749 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379781 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/565f831c-0da8-4481-8461-8522e0cfa801-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379800 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-server-conf\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8rp\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-kube-api-access-xl8rp\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379848 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.379880 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.480883 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.480966 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/565f831c-0da8-4481-8461-8522e0cfa801-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.480987 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-server-conf\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481008 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8rp\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-kube-api-access-xl8rp\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481026 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481055 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481099 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481116 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481187 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481227 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/565f831c-0da8-4481-8461-8522e0cfa801-pod-info\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.481245 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.482319 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.482599 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.484327 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.484753 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.486207 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.486806 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.495505 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/565f831c-0da8-4481-8461-8522e0cfa801-pod-info\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.501171 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.503229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.506539 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.507672 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/565f831c-0da8-4481-8461-8522e0cfa801-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.512564 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8rp\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-kube-api-access-xl8rp\") pod \"rabbitmq-server-0\" (UID: 
\"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.514302 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.517592 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.518311 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.518629 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lxwjx" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.519061 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.519851 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.520345 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.520509 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.530161 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.541486 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc 
kubenswrapper[4732]: I1010 07:07:55.550462 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684232 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684317 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684373 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684399 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684525 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684595 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684616 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684733 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684854 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684916 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.684951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vthqm\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-kube-api-access-vthqm\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.783644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" event={"ID":"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4","Type":"ContainerStarted","Data":"a93a879b56edabc21fb6f3ca793103fed86cb4eb8b009125ffc2f0465515c478"} Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.785522 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-bxpbg" event={"ID":"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc","Type":"ContainerStarted","Data":"05ad66c36d43d004aa3e47c911da38e7486adb17e6827ea877333e0ae9a3baf6"} Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786080 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786120 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vthqm\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-kube-api-access-vthqm\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786201 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786228 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786276 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786304 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786327 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786365 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786388 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786412 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.786856 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.787721 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.788089 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.788237 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.788742 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.788995 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.795610 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 
07:07:55.796430 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.798332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.798856 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.803307 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vthqm\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-kube-api-access-vthqm\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.811986 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:55 crc kubenswrapper[4732]: I1010 07:07:55.949508 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:07:56 crc kubenswrapper[4732]: I1010 07:07:56.081573 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.901752 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.910683 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.910825 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.927976 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.928257 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.928840 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.930079 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.930514 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.931235 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-r4ctf" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.936565 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.937975 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.938423 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-6rmlz" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.940373 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.940825 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.941112 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 10 07:07:57 crc kubenswrapper[4732]: I1010 07:07:57.941306 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031206 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-secrets\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031359 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031391 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs8xt\" (UniqueName: \"kubernetes.io/projected/dbaa5798-1d07-445a-a226-ad48054d3dbc-kube-api-access-xs8xt\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031422 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031460 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031492 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031575 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.031651 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032010 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032059 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032087 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84slz\" (UniqueName: \"kubernetes.io/projected/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kube-api-access-84slz\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032119 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032146 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032201 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032222 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.032345 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133429 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133493 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs8xt\" (UniqueName: \"kubernetes.io/projected/dbaa5798-1d07-445a-a226-ad48054d3dbc-kube-api-access-xs8xt\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133518 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133547 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133584 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133605 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133653 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133669 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " 
pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133684 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84slz\" (UniqueName: \"kubernetes.io/projected/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kube-api-access-84slz\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133719 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133737 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133758 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133773 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc 
kubenswrapper[4732]: I1010 07:07:58.133800 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-secrets\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133846 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.133870 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.134227 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.135705 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.135833 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.136507 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.136573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.137685 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.137928 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: 
\"dbaa5798-1d07-445a-a226-ad48054d3dbc\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.139228 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.139761 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.139842 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.141084 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.141183 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 
07:07:58.141350 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.145305 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.146190 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.149765 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-secrets\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.155581 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs8xt\" (UniqueName: \"kubernetes.io/projected/dbaa5798-1d07-445a-a226-ad48054d3dbc-kube-api-access-xs8xt\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.155659 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84slz\" (UniqueName: 
\"kubernetes.io/projected/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kube-api-access-84slz\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.168189 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.169836 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.255574 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.268730 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.533004 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.534808 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.536665 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.537555 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.538782 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xz42x" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.541755 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.645332 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.645492 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-config-data\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.645568 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-kolla-config\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.645844 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7m6mx\" (UniqueName: \"kubernetes.io/projected/ab930cd4-caad-4980-a491-8f6c5abca8bf-kube-api-access-7m6mx\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.645932 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.747111 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m6mx\" (UniqueName: \"kubernetes.io/projected/ab930cd4-caad-4980-a491-8f6c5abca8bf-kube-api-access-7m6mx\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.747151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.747177 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.747224 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-config-data\") pod \"memcached-0\" (UID: 
\"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.747248 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-kolla-config\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.747892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-kolla-config\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.748504 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-config-data\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.753104 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.753104 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.772218 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m6mx\" (UniqueName: 
\"kubernetes.io/projected/ab930cd4-caad-4980-a491-8f6c5abca8bf-kube-api-access-7m6mx\") pod \"memcached-0\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " pod="openstack/memcached-0" Oct 10 07:07:58 crc kubenswrapper[4732]: I1010 07:07:58.860040 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 10 07:07:59 crc kubenswrapper[4732]: I1010 07:07:59.822021 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"565f831c-0da8-4481-8461-8522e0cfa801","Type":"ContainerStarted","Data":"f8c832e67f10479d00bf02a9a6f6bf1974153e25af20ac26b9e5edf78b3a8e27"} Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.269991 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.303900 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.304909 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.312588 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xxrh5" Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.318924 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.374587 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rh89\" (UniqueName: \"kubernetes.io/projected/6bee7b7d-832a-4bd7-8efd-db27adf3664a-kube-api-access-6rh89\") pod \"kube-state-metrics-0\" (UID: \"6bee7b7d-832a-4bd7-8efd-db27adf3664a\") " pod="openstack/kube-state-metrics-0" Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.475892 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rh89\" (UniqueName: \"kubernetes.io/projected/6bee7b7d-832a-4bd7-8efd-db27adf3664a-kube-api-access-6rh89\") pod \"kube-state-metrics-0\" (UID: \"6bee7b7d-832a-4bd7-8efd-db27adf3664a\") " pod="openstack/kube-state-metrics-0" Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.516896 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rh89\" (UniqueName: \"kubernetes.io/projected/6bee7b7d-832a-4bd7-8efd-db27adf3664a-kube-api-access-6rh89\") pod \"kube-state-metrics-0\" (UID: \"6bee7b7d-832a-4bd7-8efd-db27adf3664a\") " pod="openstack/kube-state-metrics-0" Oct 10 07:08:00 crc kubenswrapper[4732]: I1010 07:08:00.628548 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:08:03 crc kubenswrapper[4732]: W1010 07:08:03.408592 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63706a24_ebfd_45ae_96b0_49ab7bd13fdf.slice/crio-a09ab7179a6af806a1367e461c78cdfe29d6a8b06c2347053b2e5cecb6477e47 WatchSource:0}: Error finding container a09ab7179a6af806a1367e461c78cdfe29d6a8b06c2347053b2e5cecb6477e47: Status 404 returned error can't find the container with id a09ab7179a6af806a1367e461c78cdfe29d6a8b06c2347053b2e5cecb6477e47 Oct 10 07:08:03 crc kubenswrapper[4732]: I1010 07:08:03.852494 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"63706a24-ebfd-45ae-96b0-49ab7bd13fdf","Type":"ContainerStarted","Data":"a09ab7179a6af806a1367e461c78cdfe29d6a8b06c2347053b2e5cecb6477e47"} Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.106841 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lzkzk"] Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.108114 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.113715 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.113730 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2gnsh" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.113793 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.115931 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzkzk"] Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.122521 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-n9v88"] Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.129036 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.136327 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-n9v88"] Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256131 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-combined-ca-bundle\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256217 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-scripts\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256287 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-scripts\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256320 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-run\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256340 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-log\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-ovn-controller-tls-certs\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256441 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-log-ovn\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256534 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-lib\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256617 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run-ovn\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256735 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72s7\" (UniqueName: 
\"kubernetes.io/projected/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-kube-api-access-v72s7\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256882 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9z6\" (UniqueName: \"kubernetes.io/projected/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-kube-api-access-sp9z6\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.256928 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-etc-ovs\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.358828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-run\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.358876 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-log\") pod 
\"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.358915 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-ovn-controller-tls-certs\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.358944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-log-ovn\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.358969 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-lib\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.358989 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run-ovn\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359021 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72s7\" (UniqueName: \"kubernetes.io/projected/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-kube-api-access-v72s7\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" 
Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359044 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359073 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9z6\" (UniqueName: \"kubernetes.io/projected/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-kube-api-access-sp9z6\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359094 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-etc-ovs\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-combined-ca-bundle\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359175 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-scripts\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359198 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-scripts\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359547 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-etc-ovs\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359618 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run-ovn\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359730 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-lib\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.359824 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-log-ovn\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.360160 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-run\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " 
pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.361530 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-scripts\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.361640 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.361781 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-log\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.365441 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-ovn-controller-tls-certs\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.366403 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-scripts\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.371788 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-combined-ca-bundle\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.378068 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72s7\" (UniqueName: \"kubernetes.io/projected/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-kube-api-access-v72s7\") pod \"ovn-controller-lzkzk\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.382409 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9z6\" (UniqueName: \"kubernetes.io/projected/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-kube-api-access-sp9z6\") pod \"ovn-controller-ovs-n9v88\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.443325 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzkzk" Oct 10 07:08:04 crc kubenswrapper[4732]: I1010 07:08:04.453744 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.808500 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.811318 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.816103 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.819480 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.820753 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.820866 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.820929 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8l99j" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.821082 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882272 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rttrw\" (UniqueName: \"kubernetes.io/projected/64dcf265-8f29-46bc-9b03-40dda51f606b-kube-api-access-rttrw\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882342 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882407 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882461 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882490 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882510 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-config\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882534 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.882682 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.985937 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.986028 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.986065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.986089 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-config\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.986118 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 
07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.986154 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.986260 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rttrw\" (UniqueName: \"kubernetes.io/projected/64dcf265-8f29-46bc-9b03-40dda51f606b-kube-api-access-rttrw\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.986282 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.987168 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.987429 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.988152 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" 
(UniqueName: \"kubernetes.io/empty-dir/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.988439 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-config\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:05 crc kubenswrapper[4732]: I1010 07:08:05.999904 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.006880 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rttrw\" (UniqueName: \"kubernetes.io/projected/64dcf265-8f29-46bc-9b03-40dda51f606b-kube-api-access-rttrw\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.007255 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.015971 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:06 
crc kubenswrapper[4732]: I1010 07:08:06.020518 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.133831 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.622669 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.624002 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.625830 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.626385 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.626634 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.634297 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.648174 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dx2s8" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.698329 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjfb\" (UniqueName: \"kubernetes.io/projected/ac6dedf8-3428-4444-86f9-4f25c0b916e3-kube-api-access-wnjfb\") pod 
\"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.698416 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.698520 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-config\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.698565 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.698620 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.698920 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 
07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.699012 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.700473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801475 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-config\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801497 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801525 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801558 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801577 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801647 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801668 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnjfb\" (UniqueName: \"kubernetes.io/projected/ac6dedf8-3428-4444-86f9-4f25c0b916e3-kube-api-access-wnjfb\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.801827 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.803061 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.803116 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-config\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.803306 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.807063 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.807296 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.812301 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.816766 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnjfb\" (UniqueName: \"kubernetes.io/projected/ac6dedf8-3428-4444-86f9-4f25c0b916e3-kube-api-access-wnjfb\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.838079 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:06 crc kubenswrapper[4732]: I1010 07:08:06.961666 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:11 crc kubenswrapper[4732]: E1010 07:08:11.189603 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 10 07:08:11 crc kubenswrapper[4732]: E1010 07:08:11.190175 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rjnmx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessP
robe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-644597f84c-k4q5x_openstack(a3d19ccd-5bea-4c1b-9ae0-241606d91aa4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 07:08:11 crc kubenswrapper[4732]: E1010 07:08:11.191340 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" Oct 10 07:08:11 crc kubenswrapper[4732]: E1010 07:08:11.909753 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df\\\"\"" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" Oct 10 07:08:12 crc kubenswrapper[4732]: E1010 07:08:12.159794 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 10 07:08:12 
crc kubenswrapper[4732]: E1010 07:08:12.159944 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jzsb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevi
ces:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7bfcb9d745-bh92k_openstack(8168f69a-05b1-46d0-b5df-40c830f02a0c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 07:08:12 crc kubenswrapper[4732]: E1010 07:08:12.161125 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" podUID="8168f69a-05b1-46d0-b5df-40c830f02a0c" Oct 10 07:08:12 crc kubenswrapper[4732]: E1010 07:08:12.170713 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df" Oct 10 07:08:12 crc kubenswrapper[4732]: E1010 07:08:12.170861 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:c4e71b2158fd939dad8b8e705273493051d3023273d23b279f2699dce6db33df,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n72nb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-758b79db4c-jv6g7_openstack(251bdaef-c7c6-4efe-af9a-c87cacb645c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 07:08:12 crc kubenswrapper[4732]: E1010 07:08:12.173812 4732 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" podUID="251bdaef-c7c6-4efe-af9a-c87cacb645c3" Oct 10 07:08:12 crc kubenswrapper[4732]: I1010 07:08:12.466032 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:08:12 crc kubenswrapper[4732]: I1010 07:08:12.529202 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 10 07:08:12 crc kubenswrapper[4732]: I1010 07:08:12.615624 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.524988 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.590518 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-dns-svc\") pod \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.590959 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/251bdaef-c7c6-4efe-af9a-c87cacb645c3-kube-api-access-n72nb\") pod \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.591052 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-config\") pod \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\" (UID: \"251bdaef-c7c6-4efe-af9a-c87cacb645c3\") " Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 
07:08:18.591947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-config" (OuterVolumeSpecName: "config") pod "251bdaef-c7c6-4efe-af9a-c87cacb645c3" (UID: "251bdaef-c7c6-4efe-af9a-c87cacb645c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.592461 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "251bdaef-c7c6-4efe-af9a-c87cacb645c3" (UID: "251bdaef-c7c6-4efe-af9a-c87cacb645c3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.596559 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/251bdaef-c7c6-4efe-af9a-c87cacb645c3-kube-api-access-n72nb" (OuterVolumeSpecName: "kube-api-access-n72nb") pod "251bdaef-c7c6-4efe-af9a-c87cacb645c3" (UID: "251bdaef-c7c6-4efe-af9a-c87cacb645c3"). InnerVolumeSpecName "kube-api-access-n72nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.678787 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.692376 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.692403 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n72nb\" (UniqueName: \"kubernetes.io/projected/251bdaef-c7c6-4efe-af9a-c87cacb645c3-kube-api-access-n72nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.692414 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/251bdaef-c7c6-4efe-af9a-c87cacb645c3-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.793426 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8168f69a-05b1-46d0-b5df-40c830f02a0c-config\") pod \"8168f69a-05b1-46d0-b5df-40c830f02a0c\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.793602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzsb5\" (UniqueName: \"kubernetes.io/projected/8168f69a-05b1-46d0-b5df-40c830f02a0c-kube-api-access-jzsb5\") pod \"8168f69a-05b1-46d0-b5df-40c830f02a0c\" (UID: \"8168f69a-05b1-46d0-b5df-40c830f02a0c\") " Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.794219 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8168f69a-05b1-46d0-b5df-40c830f02a0c-config" (OuterVolumeSpecName: "config") pod "8168f69a-05b1-46d0-b5df-40c830f02a0c" (UID: "8168f69a-05b1-46d0-b5df-40c830f02a0c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.798520 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8168f69a-05b1-46d0-b5df-40c830f02a0c-kube-api-access-jzsb5" (OuterVolumeSpecName: "kube-api-access-jzsb5") pod "8168f69a-05b1-46d0-b5df-40c830f02a0c" (UID: "8168f69a-05b1-46d0-b5df-40c830f02a0c"). InnerVolumeSpecName "kube-api-access-jzsb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.838178 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzkzk"] Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.868801 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 07:08:18 crc kubenswrapper[4732]: W1010 07:08:18.880456 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac6dedf8_3428_4444_86f9_4f25c0b916e3.slice/crio-ed32738bd923676863c786b74fb577e6ed95cefb0ac7a429559d826e3a76d685 WatchSource:0}: Error finding container ed32738bd923676863c786b74fb577e6ed95cefb0ac7a429559d826e3a76d685: Status 404 returned error can't find the container with id ed32738bd923676863c786b74fb577e6ed95cefb0ac7a429559d826e3a76d685 Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.895556 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzsb5\" (UniqueName: \"kubernetes.io/projected/8168f69a-05b1-46d0-b5df-40c830f02a0c-kube-api-access-jzsb5\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.895602 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8168f69a-05b1-46d0-b5df-40c830f02a0c-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.946885 4732 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:08:18 crc kubenswrapper[4732]: W1010 07:08:18.950896 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bee7b7d_832a_4bd7_8efd_db27adf3664a.slice/crio-4de17f72e4b640b82a8780dedae2d6bf6bd018bb2d4140b5229025e18125370c WatchSource:0}: Error finding container 4de17f72e4b640b82a8780dedae2d6bf6bd018bb2d4140b5229025e18125370c: Status 404 returned error can't find the container with id 4de17f72e4b640b82a8780dedae2d6bf6bd018bb2d4140b5229025e18125370c Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.976554 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-bxpbg" event={"ID":"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc","Type":"ContainerStarted","Data":"6eb97669ba18b0e56221859adeb3cc5ccb89ebb3324431e1de64879b0eb29c2c"} Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.980657 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbaa5798-1d07-445a-a226-ad48054d3dbc","Type":"ContainerStarted","Data":"26766aa0ea9f17688805b58b326f94a2932565778f9056ca41c585c549f20e5a"} Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.983993 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.984769 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"88a11668-5ab6-4b77-8bb7-ac60140f4bd4","Type":"ContainerStarted","Data":"4627ef1862c57ebe1d5e5358c0926948ebf0f4f721e9cb1259ad69071f5ddd3e"} Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.986731 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ab930cd4-caad-4980-a491-8f6c5abca8bf","Type":"ContainerStarted","Data":"cfdff48c902851a94ff0a491ee578227d5d7ef62d96b874a9d5baa4d09a7e74f"} Oct 10 07:08:18 
crc kubenswrapper[4732]: I1010 07:08:18.988043 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac6dedf8-3428-4444-86f9-4f25c0b916e3","Type":"ContainerStarted","Data":"ed32738bd923676863c786b74fb577e6ed95cefb0ac7a429559d826e3a76d685"}
Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.990914 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-758b79db4c-jv6g7" event={"ID":"251bdaef-c7c6-4efe-af9a-c87cacb645c3","Type":"ContainerDied","Data":"5b8f0d78ccf4a3b1a5086d28537d0646d241e827ef3a2f347b9263ff41c50b37"}
Oct 10 07:08:18 crc kubenswrapper[4732]: I1010 07:08:18.991070 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-758b79db4c-jv6g7"
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.015591 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k"
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.015614 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bfcb9d745-bh92k" event={"ID":"8168f69a-05b1-46d0-b5df-40c830f02a0c","Type":"ContainerDied","Data":"55f811f6ef4cb3d46b59fa94f0695ee7a052282ab4e401a9f31f43d9a11d2791"}
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.019499 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6bee7b7d-832a-4bd7-8efd-db27adf3664a","Type":"ContainerStarted","Data":"4de17f72e4b640b82a8780dedae2d6bf6bd018bb2d4140b5229025e18125370c"}
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.022079 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk" event={"ID":"b8c3140a-2ab2-44f7-9ddd-73de883c4b65","Type":"ContainerStarted","Data":"23d65ed8a4a4ec316d44a17b7d44876f3ef058d60c18fdcd3723fd327bf7237f"}
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.096947 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-jv6g7"]
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.105294 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-758b79db4c-jv6g7"]
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.115767 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-n9v88"]
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.120238 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-bh92k"]
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.124456 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bfcb9d745-bh92k"]
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.670330 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="251bdaef-c7c6-4efe-af9a-c87cacb645c3" path="/var/lib/kubelet/pods/251bdaef-c7c6-4efe-af9a-c87cacb645c3/volumes"
Oct 10 07:08:19 crc kubenswrapper[4732]: I1010 07:08:19.670757 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8168f69a-05b1-46d0-b5df-40c830f02a0c" path="/var/lib/kubelet/pods/8168f69a-05b1-46d0-b5df-40c830f02a0c/volumes"
Oct 10 07:08:19 crc kubenswrapper[4732]: W1010 07:08:19.873109 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64dcf265_8f29_46bc_9b03_40dda51f606b.slice/crio-921fc0c1c0b4fd25990384b47a4dc6619c07acc8751b61a5220b329348a3c52a WatchSource:0}: Error finding container 921fc0c1c0b4fd25990384b47a4dc6619c07acc8751b61a5220b329348a3c52a: Status 404 returned error can't find the container with id 921fc0c1c0b4fd25990384b47a4dc6619c07acc8751b61a5220b329348a3c52a
Oct 10 07:08:20 crc kubenswrapper[4732]: I1010 07:08:20.033276 4732 generic.go:334] "Generic (PLEG): container finished" podID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerID="6eb97669ba18b0e56221859adeb3cc5ccb89ebb3324431e1de64879b0eb29c2c" exitCode=0
Oct 10 07:08:20 crc kubenswrapper[4732]: I1010 07:08:20.033811 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-bxpbg" event={"ID":"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc","Type":"ContainerDied","Data":"6eb97669ba18b0e56221859adeb3cc5ccb89ebb3324431e1de64879b0eb29c2c"}
Oct 10 07:08:20 crc kubenswrapper[4732]: I1010 07:08:20.036004 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"64dcf265-8f29-46bc-9b03-40dda51f606b","Type":"ContainerStarted","Data":"921fc0c1c0b4fd25990384b47a4dc6619c07acc8751b61a5220b329348a3c52a"}
Oct 10 07:08:20 crc kubenswrapper[4732]: I1010 07:08:20.039849 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"88a11668-5ab6-4b77-8bb7-ac60140f4bd4","Type":"ContainerStarted","Data":"3ddbabed55e78f709270c00c818e9ba3b1b86ff17c658889d1c920cecadb8ebc"}
Oct 10 07:08:20 crc kubenswrapper[4732]: I1010 07:08:20.043060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"565f831c-0da8-4481-8461-8522e0cfa801","Type":"ContainerStarted","Data":"22b06feca3a6572b5d530d56c80f67e3ad45b92fa5b1fd8735418a9965bcc5fe"}
Oct 10 07:08:20 crc kubenswrapper[4732]: I1010 07:08:20.044920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9v88" event={"ID":"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030","Type":"ContainerStarted","Data":"159a334e27f4cf5d4f8dc218da8d8aec72a3205eb7d266975ccfc7379aea7813"}
Oct 10 07:08:21 crc kubenswrapper[4732]: I1010 07:08:21.054121 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"63706a24-ebfd-45ae-96b0-49ab7bd13fdf","Type":"ContainerStarted","Data":"4dd666a68eabadb0f0ffc4673eed0a478b682b215b386838970de85ec6a14574"}
Oct 10 07:08:21 crc kubenswrapper[4732]: I1010 07:08:21.058463 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-bxpbg" event={"ID":"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc","Type":"ContainerStarted","Data":"dc605f31d3b70858f7a8412c5a325028b1648af0d83ea1155646df08a63f7ce5"}
Oct 10 07:08:21 crc kubenswrapper[4732]: I1010 07:08:21.058611 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77597f887-bxpbg"
Oct 10 07:08:21 crc kubenswrapper[4732]: I1010 07:08:21.066195 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbaa5798-1d07-445a-a226-ad48054d3dbc","Type":"ContainerStarted","Data":"c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d"}
Oct 10 07:08:21 crc kubenswrapper[4732]: I1010 07:08:21.096678 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77597f887-bxpbg" podStartSLOduration=4.0691447 podStartE2EDuration="27.096652583s" podCreationTimestamp="2025-10-10 07:07:54 +0000 UTC" firstStartedPulling="2025-10-10 07:07:55.295798884 +0000 UTC m=+1002.365390125" lastFinishedPulling="2025-10-10 07:08:18.323306767 +0000 UTC m=+1025.392898008" observedRunningTime="2025-10-10 07:08:21.09136135 +0000 UTC m=+1028.160952611" watchObservedRunningTime="2025-10-10 07:08:21.096652583 +0000 UTC m=+1028.166243844"
Oct 10 07:08:24 crc kubenswrapper[4732]: I1010 07:08:24.089736 4732 generic.go:334] "Generic (PLEG): container finished" podID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerID="4dd666a68eabadb0f0ffc4673eed0a478b682b215b386838970de85ec6a14574" exitCode=0
Oct 10 07:08:24 crc kubenswrapper[4732]: I1010 07:08:24.089831 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"63706a24-ebfd-45ae-96b0-49ab7bd13fdf","Type":"ContainerDied","Data":"4dd666a68eabadb0f0ffc4673eed0a478b682b215b386838970de85ec6a14574"}
Oct 10 07:08:24 crc kubenswrapper[4732]: I1010 07:08:24.092537 4732 generic.go:334] "Generic (PLEG): container finished" podID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerID="c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d" exitCode=0
Oct 10 07:08:24 crc kubenswrapper[4732]: I1010 07:08:24.092609 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbaa5798-1d07-445a-a226-ad48054d3dbc","Type":"ContainerDied","Data":"c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.109145 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6bee7b7d-832a-4bd7-8efd-db27adf3664a","Type":"ContainerStarted","Data":"8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.109786 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.112701 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk" event={"ID":"b8c3140a-2ab2-44f7-9ddd-73de883c4b65","Type":"ContainerStarted","Data":"5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.112769 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lzkzk"
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.114834 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac6dedf8-3428-4444-86f9-4f25c0b916e3","Type":"ContainerStarted","Data":"ed3b3ed20134b6ded19f5781525993fc3a4c2c1ec91fdf5fb5344ce971381d40"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.116327 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"64dcf265-8f29-46bc-9b03-40dda51f606b","Type":"ContainerStarted","Data":"9babba80be3c1a6fd055e84387fb3c74f74af8c92bd4f83ebb53cf7d2b84b84d"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.117898 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ab930cd4-caad-4980-a491-8f6c5abca8bf","Type":"ContainerStarted","Data":"9ccbb101b60a4c5d1f4f9801ccb65f5ad73e384c6a69193236f1dbf249839c94"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.118487 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.134232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbaa5798-1d07-445a-a226-ad48054d3dbc","Type":"ContainerStarted","Data":"eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.141410 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"63706a24-ebfd-45ae-96b0-49ab7bd13fdf","Type":"ContainerStarted","Data":"61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.151788 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.971971231 podStartE2EDuration="25.151769284s" podCreationTimestamp="2025-10-10 07:08:00 +0000 UTC" firstStartedPulling="2025-10-10 07:08:18.952067073 +0000 UTC m=+1026.021658314" lastFinishedPulling="2025-10-10 07:08:24.131865126 +0000 UTC m=+1031.201456367" observedRunningTime="2025-10-10 07:08:25.12501541 +0000 UTC m=+1032.194606661" watchObservedRunningTime="2025-10-10 07:08:25.151769284 +0000 UTC m=+1032.221360525"
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.152097 4732 generic.go:334] "Generic (PLEG): container finished" podID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerID="1dd64e8398b848e20a4d4c6259005890874782848e63e18e0c5e0a41ed9ec4d4" exitCode=0
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.152166 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.861037191 podStartE2EDuration="27.152161934s" podCreationTimestamp="2025-10-10 07:07:58 +0000 UTC" firstStartedPulling="2025-10-10 07:08:18.309818782 +0000 UTC m=+1025.379410043" lastFinishedPulling="2025-10-10 07:08:23.600943545 +0000 UTC m=+1030.670534786" observedRunningTime="2025-10-10 07:08:25.144074976 +0000 UTC m=+1032.213666217" watchObservedRunningTime="2025-10-10 07:08:25.152161934 +0000 UTC m=+1032.221753175"
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.152179 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" event={"ID":"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4","Type":"ContainerDied","Data":"1dd64e8398b848e20a4d4c6259005890874782848e63e18e0c5e0a41ed9ec4d4"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.154795 4732 generic.go:334] "Generic (PLEG): container finished" podID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerID="b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631" exitCode=0
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.154831 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9v88" event={"ID":"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030","Type":"ContainerDied","Data":"b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631"}
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.167909 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lzkzk" podStartSLOduration=15.957450548 podStartE2EDuration="21.167888329s" podCreationTimestamp="2025-10-10 07:08:04 +0000 UTC" firstStartedPulling="2025-10-10 07:08:18.844375102 +0000 UTC m=+1025.913966343" lastFinishedPulling="2025-10-10 07:08:24.054812883 +0000 UTC m=+1031.124404124" observedRunningTime="2025-10-10 07:08:25.162593206 +0000 UTC m=+1032.232184477" watchObservedRunningTime="2025-10-10 07:08:25.167888329 +0000 UTC m=+1032.237479590"
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.208152 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=12.709978625 podStartE2EDuration="29.208131217s" podCreationTimestamp="2025-10-10 07:07:56 +0000 UTC" firstStartedPulling="2025-10-10 07:08:03.413630893 +0000 UTC m=+1010.483222154" lastFinishedPulling="2025-10-10 07:08:19.911783495 +0000 UTC m=+1026.981374746" observedRunningTime="2025-10-10 07:08:25.200987144 +0000 UTC m=+1032.270578395" watchObservedRunningTime="2025-10-10 07:08:25.208131217 +0000 UTC m=+1032.277722458"
Oct 10 07:08:25 crc kubenswrapper[4732]: I1010 07:08:25.244138 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=27.640433551 podStartE2EDuration="29.24410593s" podCreationTimestamp="2025-10-10 07:07:56 +0000 UTC" firstStartedPulling="2025-10-10 07:08:18.309225496 +0000 UTC m=+1025.378816737" lastFinishedPulling="2025-10-10 07:08:19.912897875 +0000 UTC m=+1026.982489116" observedRunningTime="2025-10-10 07:08:25.237394408 +0000 UTC m=+1032.306985649" watchObservedRunningTime="2025-10-10 07:08:25.24410593 +0000 UTC m=+1032.313697171"
Oct 10 07:08:26 crc kubenswrapper[4732]: I1010 07:08:26.164812 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" event={"ID":"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4","Type":"ContainerStarted","Data":"80e48c3afc29c45a62cbc434381a685ea5e3e1e6d9ebd30a081d473871110b1f"}
Oct 10 07:08:26 crc kubenswrapper[4732]: I1010 07:08:26.165586 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-644597f84c-k4q5x"
Oct 10 07:08:26 crc kubenswrapper[4732]: I1010 07:08:26.168027 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9v88" event={"ID":"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030","Type":"ContainerStarted","Data":"004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5"}
Oct 10 07:08:26 crc kubenswrapper[4732]: I1010 07:08:26.168079 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9v88" event={"ID":"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030","Type":"ContainerStarted","Data":"0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871"}
Oct 10 07:08:26 crc kubenswrapper[4732]: I1010 07:08:26.186055 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" podStartSLOduration=-9223372004.668741 podStartE2EDuration="32.186033761s" podCreationTimestamp="2025-10-10 07:07:54 +0000 UTC" firstStartedPulling="2025-10-10 07:07:54.786749184 +0000 UTC m=+1001.856340425" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:08:26.182874065 +0000 UTC m=+1033.252465316" watchObservedRunningTime="2025-10-10 07:08:26.186033761 +0000 UTC m=+1033.255625002"
Oct 10 07:08:26 crc kubenswrapper[4732]: I1010 07:08:26.205887 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-n9v88" podStartSLOduration=18.036421954 podStartE2EDuration="22.205870657s" podCreationTimestamp="2025-10-10 07:08:04 +0000 UTC" firstStartedPulling="2025-10-10 07:08:19.880847958 +0000 UTC m=+1026.950439219" lastFinishedPulling="2025-10-10 07:08:24.050296681 +0000 UTC m=+1031.119887922" observedRunningTime="2025-10-10 07:08:26.200174233 +0000 UTC m=+1033.269765474" watchObservedRunningTime="2025-10-10 07:08:26.205870657 +0000 UTC m=+1033.275461898"
Oct 10 07:08:27 crc kubenswrapper[4732]: I1010 07:08:27.178397 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-n9v88"
Oct 10 07:08:27 crc kubenswrapper[4732]: I1010 07:08:27.179013 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-n9v88"
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.192221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac6dedf8-3428-4444-86f9-4f25c0b916e3","Type":"ContainerStarted","Data":"3368687e64b8c0c613b949dc899a5ce3a9150e48d0028af9fccd8eb195d75f5b"}
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.195837 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"64dcf265-8f29-46bc-9b03-40dda51f606b","Type":"ContainerStarted","Data":"695c8e21da8fa77374077cd2cc05c6d275fab9cd31581217ca2977b370adcc19"}
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.221541 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=14.911359073 podStartE2EDuration="23.221525532s" podCreationTimestamp="2025-10-10 07:08:05 +0000 UTC" firstStartedPulling="2025-10-10 07:08:18.884245779 +0000 UTC m=+1025.953837020" lastFinishedPulling="2025-10-10 07:08:27.194412238 +0000 UTC m=+1034.264003479" observedRunningTime="2025-10-10 07:08:28.218198472 +0000 UTC m=+1035.287789713" watchObservedRunningTime="2025-10-10 07:08:28.221525532 +0000 UTC m=+1035.291116773"
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.248155 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=16.919030981 podStartE2EDuration="24.248138961s" podCreationTimestamp="2025-10-10 07:08:04 +0000 UTC" firstStartedPulling="2025-10-10 07:08:19.880071057 +0000 UTC m=+1026.949662308" lastFinishedPulling="2025-10-10 07:08:27.209179047 +0000 UTC m=+1034.278770288" observedRunningTime="2025-10-10 07:08:28.243820724 +0000 UTC m=+1035.313411965" watchObservedRunningTime="2025-10-10 07:08:28.248138961 +0000 UTC m=+1035.317730202"
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.256876 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.256939 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.270378 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 10 07:08:28 crc kubenswrapper[4732]: I1010 07:08:28.270434 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 10 07:08:29 crc kubenswrapper[4732]: E1010 07:08:29.811397 4732 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.246:52864->38.102.83.246:40175: write tcp 38.102.83.246:52864->38.102.83.246:40175: write: broken pipe
Oct 10 07:08:29 crc kubenswrapper[4732]: I1010 07:08:29.837243 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77597f887-bxpbg"
Oct 10 07:08:29 crc kubenswrapper[4732]: I1010 07:08:29.901155 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-k4q5x"]
Oct 10 07:08:29 crc kubenswrapper[4732]: I1010 07:08:29.901361 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerName="dnsmasq-dns" containerID="cri-o://80e48c3afc29c45a62cbc434381a685ea5e3e1e6d9ebd30a081d473871110b1f" gracePeriod=10
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.134252 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.197000 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.224895 4732 generic.go:334] "Generic (PLEG): container finished" podID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerID="80e48c3afc29c45a62cbc434381a685ea5e3e1e6d9ebd30a081d473871110b1f" exitCode=0
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.225081 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" event={"ID":"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4","Type":"ContainerDied","Data":"80e48c3afc29c45a62cbc434381a685ea5e3e1e6d9ebd30a081d473871110b1f"}
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.225222 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.261813 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.301063 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-k4q5x"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.316290 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.387741 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.412974 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-config\") pod \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") "
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.413118 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-dns-svc\") pod \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") "
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.413210 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnmx\" (UniqueName: \"kubernetes.io/projected/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-kube-api-access-rjnmx\") pod \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\" (UID: \"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4\") "
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.425953 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-kube-api-access-rjnmx" (OuterVolumeSpecName: "kube-api-access-rjnmx") pod "a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" (UID: "a3d19ccd-5bea-4c1b-9ae0-241606d91aa4"). InnerVolumeSpecName "kube-api-access-rjnmx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.515952 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnmx\" (UniqueName: \"kubernetes.io/projected/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-kube-api-access-rjnmx\") on node \"crc\" DevicePath \"\""
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.516823 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-config" (OuterVolumeSpecName: "config") pod "a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" (UID: "a3d19ccd-5bea-4c1b-9ae0-241606d91aa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.521520 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-8vv2z"]
Oct 10 07:08:30 crc kubenswrapper[4732]: E1010 07:08:30.521960 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerName="dnsmasq-dns"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.521983 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerName="dnsmasq-dns"
Oct 10 07:08:30 crc kubenswrapper[4732]: E1010 07:08:30.522014 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerName="init"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.522026 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerName="init"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.522230 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" containerName="dnsmasq-dns"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.523297 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.525575 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.527551 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" (UID: "a3d19ccd-5bea-4c1b-9ae0-241606d91aa4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.528827 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-8vv2z"]
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.559343 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-qnzwq"]
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.560852 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.563103 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.565798 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qnzwq"]
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617628 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovn-rundir\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617704 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-config\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617747 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5bbl\" (UniqueName: \"kubernetes.io/projected/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-kube-api-access-m5bbl\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617780 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-dns-svc\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617805 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovs-rundir\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617826 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxg8j\" (UniqueName: \"kubernetes.io/projected/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-kube-api-access-nxg8j\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617853 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-config\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.617893 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.618028 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-ovsdbserver-sb\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.618078 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-combined-ca-bundle\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.618162 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-config\") on node \"crc\" DevicePath \"\""
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.618277 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.633830 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-combined-ca-bundle\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719420 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovn-rundir\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-config\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719478 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5bbl\" (UniqueName: \"kubernetes.io/projected/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-kube-api-access-m5bbl\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719503 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-dns-svc\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719522 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovs-rundir\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxg8j\" (UniqueName: \"kubernetes.io/projected/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-kube-api-access-nxg8j\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719558 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-config\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719583 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719639 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-ovsdbserver-sb\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.719868 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovn-rundir\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.720029 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovs-rundir\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.720294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-config\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.720398 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-dns-svc\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.720540 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-ovsdbserver-sb\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.720635 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-config\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.723674 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq"
Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.739355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-combined-ca-bundle\") pod
\"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.751422 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5bbl\" (UniqueName: \"kubernetes.io/projected/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-kube-api-access-m5bbl\") pod \"ovn-controller-metrics-qnzwq\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " pod="openstack/ovn-controller-metrics-qnzwq" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.751818 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-8vv2z"] Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.752296 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxg8j\" (UniqueName: \"kubernetes.io/projected/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-kube-api-access-nxg8j\") pod \"dnsmasq-dns-54c9499b4f-8vv2z\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z" Oct 10 07:08:30 crc kubenswrapper[4732]: E1010 07:08:30.752426 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-nxg8j], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z" podUID="83f3a474-b986-42a8-9d19-5fa3ac90fb7d" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.769194 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-pbr7w"] Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.773471 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.776781 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.798112 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-pbr7w"] Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.821535 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-config\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.821637 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-dns-svc\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.821690 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-nb\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.821744 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsxxj\" (UniqueName: \"kubernetes.io/projected/7a563ba5-1973-4d09-96c3-e0a88d3f6586-kube-api-access-wsxxj\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " 
pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.821792 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-sb\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.879148 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qnzwq" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.923111 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-config\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.923181 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-dns-svc\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.923217 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-nb\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.923242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsxxj\" (UniqueName: 
\"kubernetes.io/projected/7a563ba5-1973-4d09-96c3-e0a88d3f6586-kube-api-access-wsxxj\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.923284 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-sb\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.924161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-sb\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.924499 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-dns-svc\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.925114 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-nb\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.927472 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-config\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" 
(UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.940914 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsxxj\" (UniqueName: \"kubernetes.io/projected/7a563ba5-1973-4d09-96c3-e0a88d3f6586-kube-api-access-wsxxj\") pod \"dnsmasq-dns-bc45f6dcf-pbr7w\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:30 crc kubenswrapper[4732]: I1010 07:08:30.962725 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.084004 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.100114 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.237171 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.237186 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.237261 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644597f84c-k4q5x" event={"ID":"a3d19ccd-5bea-4c1b-9ae0-241606d91aa4","Type":"ContainerDied","Data":"a93a879b56edabc21fb6f3ca793103fed86cb4eb8b009125ffc2f0465515c478"} Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.237437 4732 scope.go:117] "RemoveContainer" containerID="80e48c3afc29c45a62cbc434381a685ea5e3e1e6d9ebd30a081d473871110b1f" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.238187 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.253821 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.282769 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-k4q5x"] Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.284585 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.285653 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644597f84c-k4q5x"] Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.285952 4732 scope.go:117] "RemoveContainer" containerID="1dd64e8398b848e20a4d4c6259005890874782848e63e18e0c5e0a41ed9ec4d4" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.357227 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-qnzwq"] Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.384027 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-pbr7w"] Oct 10 07:08:31 crc kubenswrapper[4732]: W1010 07:08:31.389753 4732 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a563ba5_1973_4d09_96c3_e0a88d3f6586.slice/crio-0b03eeb0dc0e49b33015e57fd09e95e044b5b3ddb4906cdb3204e27097786e5f WatchSource:0}: Error finding container 0b03eeb0dc0e49b33015e57fd09e95e044b5b3ddb4906cdb3204e27097786e5f: Status 404 returned error can't find the container with id 0b03eeb0dc0e49b33015e57fd09e95e044b5b3ddb4906cdb3204e27097786e5f Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.435654 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-config\") pod \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.435808 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxg8j\" (UniqueName: \"kubernetes.io/projected/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-kube-api-access-nxg8j\") pod \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.435897 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-ovsdbserver-sb\") pod \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.436018 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-dns-svc\") pod \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\" (UID: \"83f3a474-b986-42a8-9d19-5fa3ac90fb7d\") " Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.436118 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-config" (OuterVolumeSpecName: "config") pod "83f3a474-b986-42a8-9d19-5fa3ac90fb7d" (UID: "83f3a474-b986-42a8-9d19-5fa3ac90fb7d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.436812 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.437620 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83f3a474-b986-42a8-9d19-5fa3ac90fb7d" (UID: "83f3a474-b986-42a8-9d19-5fa3ac90fb7d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.437663 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83f3a474-b986-42a8-9d19-5fa3ac90fb7d" (UID: "83f3a474-b986-42a8-9d19-5fa3ac90fb7d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.439769 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-kube-api-access-nxg8j" (OuterVolumeSpecName: "kube-api-access-nxg8j") pod "83f3a474-b986-42a8-9d19-5fa3ac90fb7d" (UID: "83f3a474-b986-42a8-9d19-5fa3ac90fb7d"). InnerVolumeSpecName "kube-api-access-nxg8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.505346 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.507252 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.511463 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.511725 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.511973 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.512220 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ws5gm" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.525793 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.539987 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxg8j\" (UniqueName: \"kubernetes.io/projected/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-kube-api-access-nxg8j\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.540023 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.540033 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83f3a474-b986-42a8-9d19-5fa3ac90fb7d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 
10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.641754 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.641818 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6hnt\" (UniqueName: \"kubernetes.io/projected/99f1b967-cc4d-4092-87e9-64cbbc84be27-kube-api-access-c6hnt\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.641867 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-scripts\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.642044 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-config\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.642156 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.642246 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.642349 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.668914 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d19ccd-5bea-4c1b-9ae0-241606d91aa4" path="/var/lib/kubelet/pods/a3d19ccd-5bea-4c1b-9ae0-241606d91aa4/volumes" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.744463 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.744540 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.745383 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6hnt\" (UniqueName: \"kubernetes.io/projected/99f1b967-cc4d-4092-87e9-64cbbc84be27-kube-api-access-c6hnt\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " 
pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.745441 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-scripts\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.745479 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-config\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.745537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.745592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.746406 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.746500 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-config\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.746583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-scripts\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.749253 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.749334 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.749784 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 07:08:31.763829 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6hnt\" (UniqueName: \"kubernetes.io/projected/99f1b967-cc4d-4092-87e9-64cbbc84be27-kube-api-access-c6hnt\") pod \"ovn-northd-0\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " pod="openstack/ovn-northd-0" Oct 10 07:08:31 crc kubenswrapper[4732]: I1010 
07:08:31.855475 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.244624 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qnzwq" event={"ID":"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8","Type":"ContainerStarted","Data":"6a693cf87726dd07aef0243f81c2ca77c5d4545a90e8f3f043f685eaf87b6af5"} Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.245024 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qnzwq" event={"ID":"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8","Type":"ContainerStarted","Data":"1b12750afa68cee82cea406fff26411748a1b05ec3a9a01d365a12c3b3a9f74e"} Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.246440 4732 generic.go:334] "Generic (PLEG): container finished" podID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerID="9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e" exitCode=0 Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.246488 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" event={"ID":"7a563ba5-1973-4d09-96c3-e0a88d3f6586","Type":"ContainerDied","Data":"9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e"} Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.246508 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" event={"ID":"7a563ba5-1973-4d09-96c3-e0a88d3f6586","Type":"ContainerStarted","Data":"0b03eeb0dc0e49b33015e57fd09e95e044b5b3ddb4906cdb3204e27097786e5f"} Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.247828 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54c9499b4f-8vv2z" Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.263532 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-qnzwq" podStartSLOduration=2.263513471 podStartE2EDuration="2.263513471s" podCreationTimestamp="2025-10-10 07:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:08:32.262395631 +0000 UTC m=+1039.331986882" watchObservedRunningTime="2025-10-10 07:08:32.263513471 +0000 UTC m=+1039.333104712" Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.316740 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.340898 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-8vv2z"] Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.353727 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54c9499b4f-8vv2z"] Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.358800 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 10 07:08:32 crc kubenswrapper[4732]: I1010 07:08:32.410739 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 10 07:08:33 crc kubenswrapper[4732]: I1010 07:08:33.269040 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" event={"ID":"7a563ba5-1973-4d09-96c3-e0a88d3f6586","Type":"ContainerStarted","Data":"0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72"} Oct 10 07:08:33 crc kubenswrapper[4732]: I1010 07:08:33.269350 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:33 crc kubenswrapper[4732]: I1010 
07:08:33.270181 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1b967-cc4d-4092-87e9-64cbbc84be27","Type":"ContainerStarted","Data":"63367abe9400d87b65d7cb4a168835c3c70c96a9997d7ffed71066f71459682c"} Oct 10 07:08:33 crc kubenswrapper[4732]: I1010 07:08:33.287522 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" podStartSLOduration=3.2874999320000002 podStartE2EDuration="3.287499932s" podCreationTimestamp="2025-10-10 07:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:08:33.283119062 +0000 UTC m=+1040.352710313" watchObservedRunningTime="2025-10-10 07:08:33.287499932 +0000 UTC m=+1040.357091183" Oct 10 07:08:33 crc kubenswrapper[4732]: I1010 07:08:33.674220 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f3a474-b986-42a8-9d19-5fa3ac90fb7d" path="/var/lib/kubelet/pods/83f3a474-b986-42a8-9d19-5fa3ac90fb7d/volumes" Oct 10 07:08:33 crc kubenswrapper[4732]: I1010 07:08:33.861818 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.180726 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ps9wp"] Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.183606 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ps9wp" Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.190291 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ps9wp"] Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.279247 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1b967-cc4d-4092-87e9-64cbbc84be27","Type":"ContainerStarted","Data":"1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a"} Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.279298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1b967-cc4d-4092-87e9-64cbbc84be27","Type":"ContainerStarted","Data":"47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43"} Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.290824 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqdvk\" (UniqueName: \"kubernetes.io/projected/6d361b6b-d6cf-44c8-ba94-5cbba8dae55e-kube-api-access-jqdvk\") pod \"glance-db-create-ps9wp\" (UID: \"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e\") " pod="openstack/glance-db-create-ps9wp" Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.302409 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.289971839 podStartE2EDuration="3.302390484s" podCreationTimestamp="2025-10-10 07:08:31 +0000 UTC" firstStartedPulling="2025-10-10 07:08:32.331579194 +0000 UTC m=+1039.401170435" lastFinishedPulling="2025-10-10 07:08:33.343997839 +0000 UTC m=+1040.413589080" observedRunningTime="2025-10-10 07:08:34.297354917 +0000 UTC m=+1041.366946168" watchObservedRunningTime="2025-10-10 07:08:34.302390484 +0000 UTC m=+1041.371981725" Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.395934 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jqdvk\" (UniqueName: \"kubernetes.io/projected/6d361b6b-d6cf-44c8-ba94-5cbba8dae55e-kube-api-access-jqdvk\") pod \"glance-db-create-ps9wp\" (UID: \"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e\") " pod="openstack/glance-db-create-ps9wp" Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.413514 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqdvk\" (UniqueName: \"kubernetes.io/projected/6d361b6b-d6cf-44c8-ba94-5cbba8dae55e-kube-api-access-jqdvk\") pod \"glance-db-create-ps9wp\" (UID: \"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e\") " pod="openstack/glance-db-create-ps9wp" Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.504748 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ps9wp" Oct 10 07:08:34 crc kubenswrapper[4732]: I1010 07:08:34.912103 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ps9wp"] Oct 10 07:08:34 crc kubenswrapper[4732]: W1010 07:08:34.918718 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d361b6b_d6cf_44c8_ba94_5cbba8dae55e.slice/crio-7b224ec7ee061169a35a398b9eaebb1372750579aa6bece87673a58a2aa3fbcd WatchSource:0}: Error finding container 7b224ec7ee061169a35a398b9eaebb1372750579aa6bece87673a58a2aa3fbcd: Status 404 returned error can't find the container with id 7b224ec7ee061169a35a398b9eaebb1372750579aa6bece87673a58a2aa3fbcd Oct 10 07:08:35 crc kubenswrapper[4732]: I1010 07:08:35.288076 4732 generic.go:334] "Generic (PLEG): container finished" podID="6d361b6b-d6cf-44c8-ba94-5cbba8dae55e" containerID="aec28078f83d5c42b3c9af9c5e0ca7f503376268d645f1072d4e6fcefa958f9b" exitCode=0 Oct 10 07:08:35 crc kubenswrapper[4732]: I1010 07:08:35.288141 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ps9wp" 
event={"ID":"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e","Type":"ContainerDied","Data":"aec28078f83d5c42b3c9af9c5e0ca7f503376268d645f1072d4e6fcefa958f9b"} Oct 10 07:08:35 crc kubenswrapper[4732]: I1010 07:08:35.288397 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ps9wp" event={"ID":"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e","Type":"ContainerStarted","Data":"7b224ec7ee061169a35a398b9eaebb1372750579aa6bece87673a58a2aa3fbcd"} Oct 10 07:08:35 crc kubenswrapper[4732]: I1010 07:08:35.288492 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 10 07:08:36 crc kubenswrapper[4732]: I1010 07:08:36.667836 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ps9wp" Oct 10 07:08:36 crc kubenswrapper[4732]: I1010 07:08:36.735429 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqdvk\" (UniqueName: \"kubernetes.io/projected/6d361b6b-d6cf-44c8-ba94-5cbba8dae55e-kube-api-access-jqdvk\") pod \"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e\" (UID: \"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e\") " Oct 10 07:08:36 crc kubenswrapper[4732]: I1010 07:08:36.741584 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d361b6b-d6cf-44c8-ba94-5cbba8dae55e-kube-api-access-jqdvk" (OuterVolumeSpecName: "kube-api-access-jqdvk") pod "6d361b6b-d6cf-44c8-ba94-5cbba8dae55e" (UID: "6d361b6b-d6cf-44c8-ba94-5cbba8dae55e"). InnerVolumeSpecName "kube-api-access-jqdvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:36 crc kubenswrapper[4732]: I1010 07:08:36.837104 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqdvk\" (UniqueName: \"kubernetes.io/projected/6d361b6b-d6cf-44c8-ba94-5cbba8dae55e-kube-api-access-jqdvk\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:37 crc kubenswrapper[4732]: I1010 07:08:37.314226 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ps9wp" event={"ID":"6d361b6b-d6cf-44c8-ba94-5cbba8dae55e","Type":"ContainerDied","Data":"7b224ec7ee061169a35a398b9eaebb1372750579aa6bece87673a58a2aa3fbcd"} Oct 10 07:08:37 crc kubenswrapper[4732]: I1010 07:08:37.314272 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ps9wp" Oct 10 07:08:37 crc kubenswrapper[4732]: I1010 07:08:37.314274 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b224ec7ee061169a35a398b9eaebb1372750579aa6bece87673a58a2aa3fbcd" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.446109 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vhn47"] Oct 10 07:08:38 crc kubenswrapper[4732]: E1010 07:08:38.446844 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d361b6b-d6cf-44c8-ba94-5cbba8dae55e" containerName="mariadb-database-create" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.446862 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d361b6b-d6cf-44c8-ba94-5cbba8dae55e" containerName="mariadb-database-create" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.447055 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d361b6b-d6cf-44c8-ba94-5cbba8dae55e" containerName="mariadb-database-create" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.449177 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhn47" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.455671 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhn47"] Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.566405 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh56m\" (UniqueName: \"kubernetes.io/projected/9521a409-ec60-4dfb-b864-6fe4156581bf-kube-api-access-gh56m\") pod \"keystone-db-create-vhn47\" (UID: \"9521a409-ec60-4dfb-b864-6fe4156581bf\") " pod="openstack/keystone-db-create-vhn47" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.668232 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh56m\" (UniqueName: \"kubernetes.io/projected/9521a409-ec60-4dfb-b864-6fe4156581bf-kube-api-access-gh56m\") pod \"keystone-db-create-vhn47\" (UID: \"9521a409-ec60-4dfb-b864-6fe4156581bf\") " pod="openstack/keystone-db-create-vhn47" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.686410 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh56m\" (UniqueName: \"kubernetes.io/projected/9521a409-ec60-4dfb-b864-6fe4156581bf-kube-api-access-gh56m\") pod \"keystone-db-create-vhn47\" (UID: \"9521a409-ec60-4dfb-b864-6fe4156581bf\") " pod="openstack/keystone-db-create-vhn47" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.777220 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhn47" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.788481 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6tnxp"] Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.789846 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6tnxp" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.797825 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6tnxp"] Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.873302 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btptg\" (UniqueName: \"kubernetes.io/projected/ce79fc9b-55ca-4b99-adef-500f3cf92f81-kube-api-access-btptg\") pod \"placement-db-create-6tnxp\" (UID: \"ce79fc9b-55ca-4b99-adef-500f3cf92f81\") " pod="openstack/placement-db-create-6tnxp" Oct 10 07:08:38 crc kubenswrapper[4732]: I1010 07:08:38.977356 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btptg\" (UniqueName: \"kubernetes.io/projected/ce79fc9b-55ca-4b99-adef-500f3cf92f81-kube-api-access-btptg\") pod \"placement-db-create-6tnxp\" (UID: \"ce79fc9b-55ca-4b99-adef-500f3cf92f81\") " pod="openstack/placement-db-create-6tnxp" Oct 10 07:08:39 crc kubenswrapper[4732]: I1010 07:08:39.006382 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btptg\" (UniqueName: \"kubernetes.io/projected/ce79fc9b-55ca-4b99-adef-500f3cf92f81-kube-api-access-btptg\") pod \"placement-db-create-6tnxp\" (UID: \"ce79fc9b-55ca-4b99-adef-500f3cf92f81\") " pod="openstack/placement-db-create-6tnxp" Oct 10 07:08:39 crc kubenswrapper[4732]: I1010 07:08:39.207041 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6tnxp" Oct 10 07:08:39 crc kubenswrapper[4732]: I1010 07:08:39.262310 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhn47"] Oct 10 07:08:39 crc kubenswrapper[4732]: I1010 07:08:39.331268 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhn47" event={"ID":"9521a409-ec60-4dfb-b864-6fe4156581bf","Type":"ContainerStarted","Data":"9e1aaaa2295ac36c1ec61ba735d71c9fb27f26a4b76d3b0b480733c5ff4d663e"} Oct 10 07:08:39 crc kubenswrapper[4732]: I1010 07:08:39.629111 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6tnxp"] Oct 10 07:08:39 crc kubenswrapper[4732]: W1010 07:08:39.633786 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce79fc9b_55ca_4b99_adef_500f3cf92f81.slice/crio-8bb68af9dec0b650bb3262e1eed92f3a1d17d86edf3d7ac0a26b8aacd0897cc8 WatchSource:0}: Error finding container 8bb68af9dec0b650bb3262e1eed92f3a1d17d86edf3d7ac0a26b8aacd0897cc8: Status 404 returned error can't find the container with id 8bb68af9dec0b650bb3262e1eed92f3a1d17d86edf3d7ac0a26b8aacd0897cc8 Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.352198 4732 generic.go:334] "Generic (PLEG): container finished" podID="9521a409-ec60-4dfb-b864-6fe4156581bf" containerID="f2f4bd53eac7d02bf28780a04513f842ce166a920111292a88480652aad2eaf7" exitCode=0 Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.352257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhn47" event={"ID":"9521a409-ec60-4dfb-b864-6fe4156581bf","Type":"ContainerDied","Data":"f2f4bd53eac7d02bf28780a04513f842ce166a920111292a88480652aad2eaf7"} Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.355343 4732 generic.go:334] "Generic (PLEG): container finished" podID="ce79fc9b-55ca-4b99-adef-500f3cf92f81" 
containerID="3c1c2951ed8b32ebb3e4b066da8fbc88f5d886c4680c8466ab93868d6060fee4" exitCode=0 Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.355401 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6tnxp" event={"ID":"ce79fc9b-55ca-4b99-adef-500f3cf92f81","Type":"ContainerDied","Data":"3c1c2951ed8b32ebb3e4b066da8fbc88f5d886c4680c8466ab93868d6060fee4"} Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.355760 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6tnxp" event={"ID":"ce79fc9b-55ca-4b99-adef-500f3cf92f81","Type":"ContainerStarted","Data":"8bb68af9dec0b650bb3262e1eed92f3a1d17d86edf3d7ac0a26b8aacd0897cc8"} Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.718551 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-pbr7w"] Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.718848 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerName="dnsmasq-dns" containerID="cri-o://0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72" gracePeriod=10 Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.723441 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.751189 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-ktcjd"] Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.756461 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.773600 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-ktcjd"] Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.808796 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-sb\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.808863 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-config\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.808913 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-nb\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.808932 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppdz\" (UniqueName: \"kubernetes.io/projected/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-kube-api-access-5ppdz\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.808955 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-dns-svc\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.910398 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-sb\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.910513 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-config\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.911355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-sb\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.911378 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-config\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.911468 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-nb\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.911518 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppdz\" (UniqueName: \"kubernetes.io/projected/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-kube-api-access-5ppdz\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.912113 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-nb\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.912173 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-dns-svc\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.912787 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-dns-svc\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:40 crc kubenswrapper[4732]: I1010 07:08:40.931754 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppdz\" (UniqueName: 
\"kubernetes.io/projected/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-kube-api-access-5ppdz\") pod \"dnsmasq-dns-57f58c7cff-ktcjd\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.081245 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.207835 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.319636 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-nb\") pod \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.320128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-sb\") pod \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.320254 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsxxj\" (UniqueName: \"kubernetes.io/projected/7a563ba5-1973-4d09-96c3-e0a88d3f6586-kube-api-access-wsxxj\") pod \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.320333 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-config\") pod \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\" (UID: 
\"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.320462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-dns-svc\") pod \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\" (UID: \"7a563ba5-1973-4d09-96c3-e0a88d3f6586\") " Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.326048 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a563ba5-1973-4d09-96c3-e0a88d3f6586-kube-api-access-wsxxj" (OuterVolumeSpecName: "kube-api-access-wsxxj") pod "7a563ba5-1973-4d09-96c3-e0a88d3f6586" (UID: "7a563ba5-1973-4d09-96c3-e0a88d3f6586"). InnerVolumeSpecName "kube-api-access-wsxxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.364701 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a563ba5-1973-4d09-96c3-e0a88d3f6586" (UID: "7a563ba5-1973-4d09-96c3-e0a88d3f6586"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.365472 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a563ba5-1973-4d09-96c3-e0a88d3f6586" (UID: "7a563ba5-1973-4d09-96c3-e0a88d3f6586"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.365957 4732 generic.go:334] "Generic (PLEG): container finished" podID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerID="0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72" exitCode=0 Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.366105 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.366020 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" event={"ID":"7a563ba5-1973-4d09-96c3-e0a88d3f6586","Type":"ContainerDied","Data":"0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72"} Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.366191 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" event={"ID":"7a563ba5-1973-4d09-96c3-e0a88d3f6586","Type":"ContainerDied","Data":"0b03eeb0dc0e49b33015e57fd09e95e044b5b3ddb4906cdb3204e27097786e5f"} Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.366218 4732 scope.go:117] "RemoveContainer" containerID="0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.367246 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-config" (OuterVolumeSpecName: "config") pod "7a563ba5-1973-4d09-96c3-e0a88d3f6586" (UID: "7a563ba5-1973-4d09-96c3-e0a88d3f6586"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.397935 4732 scope.go:117] "RemoveContainer" containerID="9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.399539 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a563ba5-1973-4d09-96c3-e0a88d3f6586" (UID: "7a563ba5-1973-4d09-96c3-e0a88d3f6586"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.419730 4732 scope.go:117] "RemoveContainer" containerID="0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72" Oct 10 07:08:41 crc kubenswrapper[4732]: E1010 07:08:41.421619 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72\": container with ID starting with 0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72 not found: ID does not exist" containerID="0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.421670 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72"} err="failed to get container status \"0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72\": rpc error: code = NotFound desc = could not find container \"0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72\": container with ID starting with 0b6bcb1dca6cfa959e983a971614cca81403cb1d4545dbe2811bf2f77e5a9f72 not found: ID does not exist" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.421720 4732 scope.go:117] 
"RemoveContainer" containerID="9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e" Oct 10 07:08:41 crc kubenswrapper[4732]: E1010 07:08:41.422050 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e\": container with ID starting with 9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e not found: ID does not exist" containerID="9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.422070 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e"} err="failed to get container status \"9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e\": rpc error: code = NotFound desc = could not find container \"9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e\": container with ID starting with 9efbcfd3d4622e00bf27875dff519de34ac0b8a69c50a8d79f6c05477a436d9e not found: ID does not exist" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.423102 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.423145 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.423157 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsxxj\" (UniqueName: \"kubernetes.io/projected/7a563ba5-1973-4d09-96c3-e0a88d3f6586-kube-api-access-wsxxj\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:41 crc 
kubenswrapper[4732]: I1010 07:08:41.423169 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.423178 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a563ba5-1973-4d09-96c3-e0a88d3f6586-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.586982 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-ktcjd"] Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.614931 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6tnxp" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.697223 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhn47" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.714721 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-pbr7w"] Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.721390 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc45f6dcf-pbr7w"] Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.728386 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh56m\" (UniqueName: \"kubernetes.io/projected/9521a409-ec60-4dfb-b864-6fe4156581bf-kube-api-access-gh56m\") pod \"9521a409-ec60-4dfb-b864-6fe4156581bf\" (UID: \"9521a409-ec60-4dfb-b864-6fe4156581bf\") " Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.728449 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btptg\" (UniqueName: \"kubernetes.io/projected/ce79fc9b-55ca-4b99-adef-500f3cf92f81-kube-api-access-btptg\") pod 
\"ce79fc9b-55ca-4b99-adef-500f3cf92f81\" (UID: \"ce79fc9b-55ca-4b99-adef-500f3cf92f81\") " Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.741201 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9521a409-ec60-4dfb-b864-6fe4156581bf-kube-api-access-gh56m" (OuterVolumeSpecName: "kube-api-access-gh56m") pod "9521a409-ec60-4dfb-b864-6fe4156581bf" (UID: "9521a409-ec60-4dfb-b864-6fe4156581bf"). InnerVolumeSpecName "kube-api-access-gh56m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.743118 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce79fc9b-55ca-4b99-adef-500f3cf92f81-kube-api-access-btptg" (OuterVolumeSpecName: "kube-api-access-btptg") pod "ce79fc9b-55ca-4b99-adef-500f3cf92f81" (UID: "ce79fc9b-55ca-4b99-adef-500f3cf92f81"). InnerVolumeSpecName "kube-api-access-btptg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.831583 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh56m\" (UniqueName: \"kubernetes.io/projected/9521a409-ec60-4dfb-b864-6fe4156581bf-kube-api-access-gh56m\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.831636 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btptg\" (UniqueName: \"kubernetes.io/projected/ce79fc9b-55ca-4b99-adef-500f3cf92f81-kube-api-access-btptg\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.907712 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 10 07:08:41 crc kubenswrapper[4732]: E1010 07:08:41.908153 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerName="init" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.908171 4732 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerName="init" Oct 10 07:08:41 crc kubenswrapper[4732]: E1010 07:08:41.908207 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9521a409-ec60-4dfb-b864-6fe4156581bf" containerName="mariadb-database-create" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.908217 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9521a409-ec60-4dfb-b864-6fe4156581bf" containerName="mariadb-database-create" Oct 10 07:08:41 crc kubenswrapper[4732]: E1010 07:08:41.908228 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerName="dnsmasq-dns" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.908236 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerName="dnsmasq-dns" Oct 10 07:08:41 crc kubenswrapper[4732]: E1010 07:08:41.908257 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce79fc9b-55ca-4b99-adef-500f3cf92f81" containerName="mariadb-database-create" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.908266 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce79fc9b-55ca-4b99-adef-500f3cf92f81" containerName="mariadb-database-create" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.908450 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerName="dnsmasq-dns" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.908468 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9521a409-ec60-4dfb-b864-6fe4156581bf" containerName="mariadb-database-create" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.908488 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce79fc9b-55ca-4b99-adef-500f3cf92f81" containerName="mariadb-database-create" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 
07:08:41.942838 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.943290 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.946791 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.946907 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.947204 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jxs2b" Oct 10 07:08:41 crc kubenswrapper[4732]: I1010 07:08:41.947379 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.035113 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.035571 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-cache\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.035761 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-lock\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " 
pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.035791 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.035822 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pvkk\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-kube-api-access-7pvkk\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.137454 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-lock\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.137523 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.137562 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pvkk\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-kube-api-access-7pvkk\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.137595 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.137623 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-cache\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: E1010 07:08:42.137967 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 07:08:42 crc kubenswrapper[4732]: E1010 07:08:42.138056 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 07:08:42 crc kubenswrapper[4732]: E1010 07:08:42.138175 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift podName:6ec5be94-f09a-4728-8858-c18fbd9ca2c2 nodeName:}" failed. No retries permitted until 2025-10-10 07:08:42.638148961 +0000 UTC m=+1049.707740202 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift") pod "swift-storage-0" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2") : configmap "swift-ring-files" not found Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.138240 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.138610 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-cache\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.138631 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-lock\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.156785 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pvkk\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-kube-api-access-7pvkk\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.159395 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " 
pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.376611 4732 generic.go:334] "Generic (PLEG): container finished" podID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerID="4f39667823d73599e823341b3ce2ee66d4b6f95631b4a7222a0056f73811dbf8" exitCode=0 Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.376701 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" event={"ID":"b8ffdec9-7aa5-4ae8-b860-f3fad859308c","Type":"ContainerDied","Data":"4f39667823d73599e823341b3ce2ee66d4b6f95631b4a7222a0056f73811dbf8"} Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.376748 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" event={"ID":"b8ffdec9-7aa5-4ae8-b860-f3fad859308c","Type":"ContainerStarted","Data":"3729c87082f24e5168376c3a6a520f06edacfcd72beee8adee185d338d12e5dd"} Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.378569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhn47" event={"ID":"9521a409-ec60-4dfb-b864-6fe4156581bf","Type":"ContainerDied","Data":"9e1aaaa2295ac36c1ec61ba735d71c9fb27f26a4b76d3b0b480733c5ff4d663e"} Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.378595 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e1aaaa2295ac36c1ec61ba735d71c9fb27f26a4b76d3b0b480733c5ff4d663e" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.378758 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhn47" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.379893 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6tnxp" event={"ID":"ce79fc9b-55ca-4b99-adef-500f3cf92f81","Type":"ContainerDied","Data":"8bb68af9dec0b650bb3262e1eed92f3a1d17d86edf3d7ac0a26b8aacd0897cc8"} Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.380029 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bb68af9dec0b650bb3262e1eed92f3a1d17d86edf3d7ac0a26b8aacd0897cc8" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.380158 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6tnxp" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.462314 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-82cjr"] Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.463429 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.470921 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.471311 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.471981 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.476944 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-82cjr"] Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.524228 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-p5t8l"] Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.525478 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.535474 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p5t8l"] Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.549311 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-scripts\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.549493 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/898d7ec3-23c2-40a7-b224-cb69ac84e188-etc-swift\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.549732 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8lz7\" (UniqueName: \"kubernetes.io/projected/898d7ec3-23c2-40a7-b224-cb69ac84e188-kube-api-access-f8lz7\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.549884 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-swiftconf\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.550035 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-combined-ca-bundle\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.550268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-ring-data-devices\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.550382 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-dispersionconf\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.550644 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-ring-data-devices\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.550961 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-swiftconf\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.551096 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lms44\" (UniqueName: \"kubernetes.io/projected/361b3420-23c6-4075-ba01-3db1fd10c1d4-kube-api-access-lms44\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.551238 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-scripts\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.551427 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-combined-ca-bundle\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.551563 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/361b3420-23c6-4075-ba01-3db1fd10c1d4-etc-swift\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.551677 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-dispersionconf\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.561814 4732 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-82cjr"] Oct 10 07:08:42 crc kubenswrapper[4732]: E1010 07:08:42.562518 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-lms44 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-82cjr" podUID="361b3420-23c6-4075-ba01-3db1fd10c1d4" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.652932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-combined-ca-bundle\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.652978 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/361b3420-23c6-4075-ba01-3db1fd10c1d4-etc-swift\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.652997 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-dispersionconf\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653022 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " 
pod="openstack/swift-storage-0" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653050 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-scripts\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653072 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/898d7ec3-23c2-40a7-b224-cb69ac84e188-etc-swift\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653090 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8lz7\" (UniqueName: \"kubernetes.io/projected/898d7ec3-23c2-40a7-b224-cb69ac84e188-kube-api-access-f8lz7\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653111 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-swiftconf\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653141 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-combined-ca-bundle\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 
07:08:42.653158 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-ring-data-devices\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653173 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-dispersionconf\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653210 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-ring-data-devices\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653233 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-swiftconf\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653251 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lms44\" (UniqueName: \"kubernetes.io/projected/361b3420-23c6-4075-ba01-3db1fd10c1d4-kube-api-access-lms44\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.653270 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-scripts\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.654085 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-scripts\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.655312 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-ring-data-devices\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: E1010 07:08:42.655412 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 07:08:42 crc kubenswrapper[4732]: E1010 07:08:42.655441 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 07:08:42 crc kubenswrapper[4732]: E1010 07:08:42.655502 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift podName:6ec5be94-f09a-4728-8858-c18fbd9ca2c2 nodeName:}" failed. No retries permitted until 2025-10-10 07:08:43.655479931 +0000 UTC m=+1050.725071172 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift") pod "swift-storage-0" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2") : configmap "swift-ring-files" not found Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.655638 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/898d7ec3-23c2-40a7-b224-cb69ac84e188-etc-swift\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.655782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-ring-data-devices\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.655871 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/361b3420-23c6-4075-ba01-3db1fd10c1d4-etc-swift\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.656146 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-scripts\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.657376 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-swiftconf\") pod 
\"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.657839 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-combined-ca-bundle\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.657919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-swiftconf\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.658443 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-dispersionconf\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.658555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-dispersionconf\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.658759 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-combined-ca-bundle\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" 
Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.671681 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lms44\" (UniqueName: \"kubernetes.io/projected/361b3420-23c6-4075-ba01-3db1fd10c1d4-kube-api-access-lms44\") pod \"swift-ring-rebalance-82cjr\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.679958 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8lz7\" (UniqueName: \"kubernetes.io/projected/898d7ec3-23c2-40a7-b224-cb69ac84e188-kube-api-access-f8lz7\") pod \"swift-ring-rebalance-p5t8l\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:42 crc kubenswrapper[4732]: I1010 07:08:42.888099 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.164522 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-p5t8l"] Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.388516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p5t8l" event={"ID":"898d7ec3-23c2-40a7-b224-cb69ac84e188","Type":"ContainerStarted","Data":"51c0f2e4b455eab47c0fa8a02670fdcb6b35611bc60bc745a0ab4fd94b9b81e6"} Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.390601 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" event={"ID":"b8ffdec9-7aa5-4ae8-b860-f3fad859308c","Type":"ContainerStarted","Data":"aaff43b96fa7dcd4402138faf31aab86679b10bf84ebc281590018b708b6cb13"} Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.390585 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.390967 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.402115 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.413309 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" podStartSLOduration=3.4132954460000002 podStartE2EDuration="3.413295446s" podCreationTimestamp="2025-10-10 07:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:08:43.409927055 +0000 UTC m=+1050.479518326" watchObservedRunningTime="2025-10-10 07:08:43.413295446 +0000 UTC m=+1050.482886687" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.466766 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-dispersionconf\") pod \"361b3420-23c6-4075-ba01-3db1fd10c1d4\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.466829 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-swiftconf\") pod \"361b3420-23c6-4075-ba01-3db1fd10c1d4\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.466856 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lms44\" (UniqueName: \"kubernetes.io/projected/361b3420-23c6-4075-ba01-3db1fd10c1d4-kube-api-access-lms44\") pod 
\"361b3420-23c6-4075-ba01-3db1fd10c1d4\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.466888 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-combined-ca-bundle\") pod \"361b3420-23c6-4075-ba01-3db1fd10c1d4\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.466923 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/361b3420-23c6-4075-ba01-3db1fd10c1d4-etc-swift\") pod \"361b3420-23c6-4075-ba01-3db1fd10c1d4\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.466952 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-scripts\") pod \"361b3420-23c6-4075-ba01-3db1fd10c1d4\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.466969 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-ring-data-devices\") pod \"361b3420-23c6-4075-ba01-3db1fd10c1d4\" (UID: \"361b3420-23c6-4075-ba01-3db1fd10c1d4\") " Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.467383 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361b3420-23c6-4075-ba01-3db1fd10c1d4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "361b3420-23c6-4075-ba01-3db1fd10c1d4" (UID: "361b3420-23c6-4075-ba01-3db1fd10c1d4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.467505 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "361b3420-23c6-4075-ba01-3db1fd10c1d4" (UID: "361b3420-23c6-4075-ba01-3db1fd10c1d4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.467587 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-scripts" (OuterVolumeSpecName: "scripts") pod "361b3420-23c6-4075-ba01-3db1fd10c1d4" (UID: "361b3420-23c6-4075-ba01-3db1fd10c1d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.468153 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/361b3420-23c6-4075-ba01-3db1fd10c1d4-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.468341 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.468363 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/361b3420-23c6-4075-ba01-3db1fd10c1d4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.471747 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod 
"361b3420-23c6-4075-ba01-3db1fd10c1d4" (UID: "361b3420-23c6-4075-ba01-3db1fd10c1d4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.471821 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "361b3420-23c6-4075-ba01-3db1fd10c1d4" (UID: "361b3420-23c6-4075-ba01-3db1fd10c1d4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.472354 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "361b3420-23c6-4075-ba01-3db1fd10c1d4" (UID: "361b3420-23c6-4075-ba01-3db1fd10c1d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.472799 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361b3420-23c6-4075-ba01-3db1fd10c1d4-kube-api-access-lms44" (OuterVolumeSpecName: "kube-api-access-lms44") pod "361b3420-23c6-4075-ba01-3db1fd10c1d4" (UID: "361b3420-23c6-4075-ba01-3db1fd10c1d4"). InnerVolumeSpecName "kube-api-access-lms44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.569818 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.569853 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.569863 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lms44\" (UniqueName: \"kubernetes.io/projected/361b3420-23c6-4075-ba01-3db1fd10c1d4-kube-api-access-lms44\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.569874 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361b3420-23c6-4075-ba01-3db1fd10c1d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.671266 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" path="/var/lib/kubelet/pods/7a563ba5-1973-4d09-96c3-e0a88d3f6586/volumes" Oct 10 07:08:43 crc kubenswrapper[4732]: I1010 07:08:43.671838 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:43 crc kubenswrapper[4732]: E1010 07:08:43.671995 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 07:08:43 crc kubenswrapper[4732]: E1010 07:08:43.672029 4732 projected.go:194] 
Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 07:08:43 crc kubenswrapper[4732]: E1010 07:08:43.672085 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift podName:6ec5be94-f09a-4728-8858-c18fbd9ca2c2 nodeName:}" failed. No retries permitted until 2025-10-10 07:08:45.672062779 +0000 UTC m=+1052.741654030 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift") pod "swift-storage-0" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2") : configmap "swift-ring-files" not found Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.197611 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db70-account-create-lwrb5"] Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.198948 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db70-account-create-lwrb5" Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.201489 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.207394 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db70-account-create-lwrb5"] Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.287163 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/ee98afa3-91b1-4d45-9bf8-e3659b14be63-kube-api-access-lxjfl\") pod \"glance-db70-account-create-lwrb5\" (UID: \"ee98afa3-91b1-4d45-9bf8-e3659b14be63\") " pod="openstack/glance-db70-account-create-lwrb5" Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.388035 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/ee98afa3-91b1-4d45-9bf8-e3659b14be63-kube-api-access-lxjfl\") pod \"glance-db70-account-create-lwrb5\" (UID: \"ee98afa3-91b1-4d45-9bf8-e3659b14be63\") " pod="openstack/glance-db70-account-create-lwrb5" Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.406498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/ee98afa3-91b1-4d45-9bf8-e3659b14be63-kube-api-access-lxjfl\") pod \"glance-db70-account-create-lwrb5\" (UID: \"ee98afa3-91b1-4d45-9bf8-e3659b14be63\") " pod="openstack/glance-db70-account-create-lwrb5" Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.407799 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-82cjr" Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.486284 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-82cjr"] Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.493647 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-82cjr"] Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.525460 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db70-account-create-lwrb5" Oct 10 07:08:44 crc kubenswrapper[4732]: I1010 07:08:44.976771 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db70-account-create-lwrb5"] Oct 10 07:08:45 crc kubenswrapper[4732]: I1010 07:08:45.673094 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361b3420-23c6-4075-ba01-3db1fd10c1d4" path="/var/lib/kubelet/pods/361b3420-23c6-4075-ba01-3db1fd10c1d4/volumes" Oct 10 07:08:45 crc kubenswrapper[4732]: I1010 07:08:45.709478 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:45 crc kubenswrapper[4732]: E1010 07:08:45.710973 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 07:08:45 crc kubenswrapper[4732]: E1010 07:08:45.711002 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 07:08:45 crc kubenswrapper[4732]: E1010 07:08:45.711065 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift podName:6ec5be94-f09a-4728-8858-c18fbd9ca2c2 nodeName:}" 
failed. No retries permitted until 2025-10-10 07:08:49.711041575 +0000 UTC m=+1056.780632836 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift") pod "swift-storage-0" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2") : configmap "swift-ring-files" not found Oct 10 07:08:46 crc kubenswrapper[4732]: I1010 07:08:46.101557 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-bc45f6dcf-pbr7w" podUID="7a563ba5-1973-4d09-96c3-e0a88d3f6586" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Oct 10 07:08:46 crc kubenswrapper[4732]: I1010 07:08:46.915533 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 10 07:08:47 crc kubenswrapper[4732]: I1010 07:08:47.428998 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db70-account-create-lwrb5" event={"ID":"ee98afa3-91b1-4d45-9bf8-e3659b14be63","Type":"ContainerStarted","Data":"0c3dc5c04d7019b10ef2df774a2829edb174126d7b1beab27007801a2719c27c"} Oct 10 07:08:47 crc kubenswrapper[4732]: I1010 07:08:47.429324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db70-account-create-lwrb5" event={"ID":"ee98afa3-91b1-4d45-9bf8-e3659b14be63","Type":"ContainerStarted","Data":"1f464db9e544abc02fa5a4a261922e73f8e4341b74d4e6a24cde888a9624e69d"} Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.437944 4732 generic.go:334] "Generic (PLEG): container finished" podID="ee98afa3-91b1-4d45-9bf8-e3659b14be63" containerID="0c3dc5c04d7019b10ef2df774a2829edb174126d7b1beab27007801a2719c27c" exitCode=0 Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.438630 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db70-account-create-lwrb5" 
event={"ID":"ee98afa3-91b1-4d45-9bf8-e3659b14be63","Type":"ContainerDied","Data":"0c3dc5c04d7019b10ef2df774a2829edb174126d7b1beab27007801a2719c27c"} Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.440068 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p5t8l" event={"ID":"898d7ec3-23c2-40a7-b224-cb69ac84e188","Type":"ContainerStarted","Data":"94533af429c5430b6ef2119bcdaab1c9b6bf64102f9f0379d38c2cf1c7403d0d"} Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.471438 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-p5t8l" podStartSLOduration=1.581010101 podStartE2EDuration="6.471422065s" podCreationTimestamp="2025-10-10 07:08:42 +0000 UTC" firstStartedPulling="2025-10-10 07:08:43.173616173 +0000 UTC m=+1050.243207414" lastFinishedPulling="2025-10-10 07:08:48.064028137 +0000 UTC m=+1055.133619378" observedRunningTime="2025-10-10 07:08:48.468345841 +0000 UTC m=+1055.537937102" watchObservedRunningTime="2025-10-10 07:08:48.471422065 +0000 UTC m=+1055.541013316" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.503313 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-653d-account-create-9hqh8"] Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.504308 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-653d-account-create-9hqh8" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.511151 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.512429 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-653d-account-create-9hqh8"] Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.550299 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj4rr\" (UniqueName: \"kubernetes.io/projected/b2578031-2533-4d9f-b953-0452e05e88e8-kube-api-access-zj4rr\") pod \"keystone-653d-account-create-9hqh8\" (UID: \"b2578031-2533-4d9f-b953-0452e05e88e8\") " pod="openstack/keystone-653d-account-create-9hqh8" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.652108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj4rr\" (UniqueName: \"kubernetes.io/projected/b2578031-2533-4d9f-b953-0452e05e88e8-kube-api-access-zj4rr\") pod \"keystone-653d-account-create-9hqh8\" (UID: \"b2578031-2533-4d9f-b953-0452e05e88e8\") " pod="openstack/keystone-653d-account-create-9hqh8" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.669245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj4rr\" (UniqueName: \"kubernetes.io/projected/b2578031-2533-4d9f-b953-0452e05e88e8-kube-api-access-zj4rr\") pod \"keystone-653d-account-create-9hqh8\" (UID: \"b2578031-2533-4d9f-b953-0452e05e88e8\") " pod="openstack/keystone-653d-account-create-9hqh8" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.828870 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-653d-account-create-9hqh8" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.905103 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0e64-account-create-tsbdp"] Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.906463 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0e64-account-create-tsbdp" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.908707 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.911025 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0e64-account-create-tsbdp"] Oct 10 07:08:48 crc kubenswrapper[4732]: I1010 07:08:48.960162 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c6t8\" (UniqueName: \"kubernetes.io/projected/cf4e373d-3210-42f5-9ec0-506c454718d2-kube-api-access-6c6t8\") pod \"placement-0e64-account-create-tsbdp\" (UID: \"cf4e373d-3210-42f5-9ec0-506c454718d2\") " pod="openstack/placement-0e64-account-create-tsbdp" Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.062903 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c6t8\" (UniqueName: \"kubernetes.io/projected/cf4e373d-3210-42f5-9ec0-506c454718d2-kube-api-access-6c6t8\") pod \"placement-0e64-account-create-tsbdp\" (UID: \"cf4e373d-3210-42f5-9ec0-506c454718d2\") " pod="openstack/placement-0e64-account-create-tsbdp" Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.083509 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c6t8\" (UniqueName: \"kubernetes.io/projected/cf4e373d-3210-42f5-9ec0-506c454718d2-kube-api-access-6c6t8\") pod \"placement-0e64-account-create-tsbdp\" (UID: \"cf4e373d-3210-42f5-9ec0-506c454718d2\") " 
pod="openstack/placement-0e64-account-create-tsbdp" Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.269664 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0e64-account-create-tsbdp" Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.279909 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-653d-account-create-9hqh8"] Oct 10 07:08:49 crc kubenswrapper[4732]: W1010 07:08:49.299774 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2578031_2533_4d9f_b953_0452e05e88e8.slice/crio-5daadc73110239a1cbb078c74fdb5683a319562953e44b351ad1bfce76927634 WatchSource:0}: Error finding container 5daadc73110239a1cbb078c74fdb5683a319562953e44b351ad1bfce76927634: Status 404 returned error can't find the container with id 5daadc73110239a1cbb078c74fdb5683a319562953e44b351ad1bfce76927634 Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.448513 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-653d-account-create-9hqh8" event={"ID":"b2578031-2533-4d9f-b953-0452e05e88e8","Type":"ContainerStarted","Data":"5daadc73110239a1cbb078c74fdb5683a319562953e44b351ad1bfce76927634"} Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.701003 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0e64-account-create-tsbdp"] Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.736956 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db70-account-create-lwrb5" Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.779048 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/ee98afa3-91b1-4d45-9bf8-e3659b14be63-kube-api-access-lxjfl\") pod \"ee98afa3-91b1-4d45-9bf8-e3659b14be63\" (UID: \"ee98afa3-91b1-4d45-9bf8-e3659b14be63\") " Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.779544 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:49 crc kubenswrapper[4732]: E1010 07:08:49.779748 4732 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 10 07:08:49 crc kubenswrapper[4732]: E1010 07:08:49.779769 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 10 07:08:49 crc kubenswrapper[4732]: E1010 07:08:49.779815 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift podName:6ec5be94-f09a-4728-8858-c18fbd9ca2c2 nodeName:}" failed. No retries permitted until 2025-10-10 07:08:57.779801166 +0000 UTC m=+1064.849392397 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift") pod "swift-storage-0" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2") : configmap "swift-ring-files" not found Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.787896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee98afa3-91b1-4d45-9bf8-e3659b14be63-kube-api-access-lxjfl" (OuterVolumeSpecName: "kube-api-access-lxjfl") pod "ee98afa3-91b1-4d45-9bf8-e3659b14be63" (UID: "ee98afa3-91b1-4d45-9bf8-e3659b14be63"). InnerVolumeSpecName "kube-api-access-lxjfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:49 crc kubenswrapper[4732]: I1010 07:08:49.880980 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjfl\" (UniqueName: \"kubernetes.io/projected/ee98afa3-91b1-4d45-9bf8-e3659b14be63-kube-api-access-lxjfl\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.456248 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db70-account-create-lwrb5" event={"ID":"ee98afa3-91b1-4d45-9bf8-e3659b14be63","Type":"ContainerDied","Data":"1f464db9e544abc02fa5a4a261922e73f8e4341b74d4e6a24cde888a9624e69d"} Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.456588 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f464db9e544abc02fa5a4a261922e73f8e4341b74d4e6a24cde888a9624e69d" Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.456306 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db70-account-create-lwrb5" Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.458303 4732 generic.go:334] "Generic (PLEG): container finished" podID="b2578031-2533-4d9f-b953-0452e05e88e8" containerID="34891118edba856958b17de5cd936c661fa4707c5c701fe15672ea786340be95" exitCode=0 Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.458374 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-653d-account-create-9hqh8" event={"ID":"b2578031-2533-4d9f-b953-0452e05e88e8","Type":"ContainerDied","Data":"34891118edba856958b17de5cd936c661fa4707c5c701fe15672ea786340be95"} Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.469634 4732 generic.go:334] "Generic (PLEG): container finished" podID="cf4e373d-3210-42f5-9ec0-506c454718d2" containerID="f0961a2f7d699cccb05bc8db81a5efe1cac37f28387231f8e6ffe35fee84e710" exitCode=0 Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.469707 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0e64-account-create-tsbdp" event={"ID":"cf4e373d-3210-42f5-9ec0-506c454718d2","Type":"ContainerDied","Data":"f0961a2f7d699cccb05bc8db81a5efe1cac37f28387231f8e6ffe35fee84e710"} Oct 10 07:08:50 crc kubenswrapper[4732]: I1010 07:08:50.469736 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0e64-account-create-tsbdp" event={"ID":"cf4e373d-3210-42f5-9ec0-506c454718d2","Type":"ContainerStarted","Data":"fa6705f709ab967b82d8a6a9b752ffde1cbf5bf8da0cacd4e24d79881ebd561b"} Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.083915 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.139840 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-bxpbg"] Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.140054 4732 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-77597f887-bxpbg" podUID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerName="dnsmasq-dns" containerID="cri-o://dc605f31d3b70858f7a8412c5a325028b1648af0d83ea1155646df08a63f7ce5" gracePeriod=10 Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.484337 4732 generic.go:334] "Generic (PLEG): container finished" podID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerID="dc605f31d3b70858f7a8412c5a325028b1648af0d83ea1155646df08a63f7ce5" exitCode=0 Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.484432 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-bxpbg" event={"ID":"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc","Type":"ContainerDied","Data":"dc605f31d3b70858f7a8412c5a325028b1648af0d83ea1155646df08a63f7ce5"} Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.679164 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.715340 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-dns-svc\") pod \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.715423 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt24t\" (UniqueName: \"kubernetes.io/projected/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-kube-api-access-nt24t\") pod \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.715460 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-config\") pod 
\"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\" (UID: \"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc\") " Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.722636 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-kube-api-access-nt24t" (OuterVolumeSpecName: "kube-api-access-nt24t") pod "104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" (UID: "104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc"). InnerVolumeSpecName "kube-api-access-nt24t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.792964 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" (UID: "104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.813477 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-config" (OuterVolumeSpecName: "config") pod "104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" (UID: "104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.817664 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.817709 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt24t\" (UniqueName: \"kubernetes.io/projected/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-kube-api-access-nt24t\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.817749 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.858289 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0e64-account-create-tsbdp" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.889915 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-653d-account-create-9hqh8" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.919146 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj4rr\" (UniqueName: \"kubernetes.io/projected/b2578031-2533-4d9f-b953-0452e05e88e8-kube-api-access-zj4rr\") pod \"b2578031-2533-4d9f-b953-0452e05e88e8\" (UID: \"b2578031-2533-4d9f-b953-0452e05e88e8\") " Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.919216 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c6t8\" (UniqueName: \"kubernetes.io/projected/cf4e373d-3210-42f5-9ec0-506c454718d2-kube-api-access-6c6t8\") pod \"cf4e373d-3210-42f5-9ec0-506c454718d2\" (UID: \"cf4e373d-3210-42f5-9ec0-506c454718d2\") " Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.926331 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf4e373d-3210-42f5-9ec0-506c454718d2-kube-api-access-6c6t8" (OuterVolumeSpecName: "kube-api-access-6c6t8") pod "cf4e373d-3210-42f5-9ec0-506c454718d2" (UID: "cf4e373d-3210-42f5-9ec0-506c454718d2"). InnerVolumeSpecName "kube-api-access-6c6t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:51 crc kubenswrapper[4732]: I1010 07:08:51.926413 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2578031-2533-4d9f-b953-0452e05e88e8-kube-api-access-zj4rr" (OuterVolumeSpecName: "kube-api-access-zj4rr") pod "b2578031-2533-4d9f-b953-0452e05e88e8" (UID: "b2578031-2533-4d9f-b953-0452e05e88e8"). InnerVolumeSpecName "kube-api-access-zj4rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.020735 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj4rr\" (UniqueName: \"kubernetes.io/projected/b2578031-2533-4d9f-b953-0452e05e88e8-kube-api-access-zj4rr\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.020771 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c6t8\" (UniqueName: \"kubernetes.io/projected/cf4e373d-3210-42f5-9ec0-506c454718d2-kube-api-access-6c6t8\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.521521 4732 generic.go:334] "Generic (PLEG): container finished" podID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerID="3ddbabed55e78f709270c00c818e9ba3b1b86ff17c658889d1c920cecadb8ebc" exitCode=0 Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.521563 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"88a11668-5ab6-4b77-8bb7-ac60140f4bd4","Type":"ContainerDied","Data":"3ddbabed55e78f709270c00c818e9ba3b1b86ff17c658889d1c920cecadb8ebc"} Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.525268 4732 generic.go:334] "Generic (PLEG): container finished" podID="565f831c-0da8-4481-8461-8522e0cfa801" containerID="22b06feca3a6572b5d530d56c80f67e3ad45b92fa5b1fd8735418a9965bcc5fe" exitCode=0 Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.525340 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"565f831c-0da8-4481-8461-8522e0cfa801","Type":"ContainerDied","Data":"22b06feca3a6572b5d530d56c80f67e3ad45b92fa5b1fd8735418a9965bcc5fe"} Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.527758 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77597f887-bxpbg" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.527766 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77597f887-bxpbg" event={"ID":"104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc","Type":"ContainerDied","Data":"05ad66c36d43d004aa3e47c911da38e7486adb17e6827ea877333e0ae9a3baf6"} Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.527812 4732 scope.go:117] "RemoveContainer" containerID="dc605f31d3b70858f7a8412c5a325028b1648af0d83ea1155646df08a63f7ce5" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.555932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-653d-account-create-9hqh8" event={"ID":"b2578031-2533-4d9f-b953-0452e05e88e8","Type":"ContainerDied","Data":"5daadc73110239a1cbb078c74fdb5683a319562953e44b351ad1bfce76927634"} Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.555978 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5daadc73110239a1cbb078c74fdb5683a319562953e44b351ad1bfce76927634" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.556102 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-653d-account-create-9hqh8" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.562632 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0e64-account-create-tsbdp" event={"ID":"cf4e373d-3210-42f5-9ec0-506c454718d2","Type":"ContainerDied","Data":"fa6705f709ab967b82d8a6a9b752ffde1cbf5bf8da0cacd4e24d79881ebd561b"} Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.562719 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6705f709ab967b82d8a6a9b752ffde1cbf5bf8da0cacd4e24d79881ebd561b" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.562728 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-0e64-account-create-tsbdp" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.696890 4732 scope.go:117] "RemoveContainer" containerID="6eb97669ba18b0e56221859adeb3cc5ccb89ebb3324431e1de64879b0eb29c2c" Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.698516 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77597f887-bxpbg"] Oct 10 07:08:52 crc kubenswrapper[4732]: I1010 07:08:52.708815 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77597f887-bxpbg"] Oct 10 07:08:53 crc kubenswrapper[4732]: I1010 07:08:53.572448 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"88a11668-5ab6-4b77-8bb7-ac60140f4bd4","Type":"ContainerStarted","Data":"84a5b3ebb026e19550cf8d398201da96d41bac755d18b33cc928544d7a6cf2c5"} Oct 10 07:08:53 crc kubenswrapper[4732]: I1010 07:08:53.573859 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:08:53 crc kubenswrapper[4732]: I1010 07:08:53.575824 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"565f831c-0da8-4481-8461-8522e0cfa801","Type":"ContainerStarted","Data":"f7527cba13db589dd756b76500f8bbf94063e072e3428b0e50a4da46b0e63723"} Oct 10 07:08:53 crc kubenswrapper[4732]: I1010 07:08:53.576065 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 10 07:08:53 crc kubenswrapper[4732]: I1010 07:08:53.604437 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=59.604420501 podStartE2EDuration="59.604420501s" podCreationTimestamp="2025-10-10 07:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:08:53.602864749 +0000 UTC 
m=+1060.672456010" watchObservedRunningTime="2025-10-10 07:08:53.604420501 +0000 UTC m=+1060.674011742" Oct 10 07:08:53 crc kubenswrapper[4732]: I1010 07:08:53.642623 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.416373907 podStartE2EDuration="59.642580099s" podCreationTimestamp="2025-10-10 07:07:54 +0000 UTC" firstStartedPulling="2025-10-10 07:07:59.09062616 +0000 UTC m=+1006.160217401" lastFinishedPulling="2025-10-10 07:08:18.316832352 +0000 UTC m=+1025.386423593" observedRunningTime="2025-10-10 07:08:53.638791115 +0000 UTC m=+1060.708382376" watchObservedRunningTime="2025-10-10 07:08:53.642580099 +0000 UTC m=+1060.712171340" Oct 10 07:08:53 crc kubenswrapper[4732]: I1010 07:08:53.671068 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" path="/var/lib/kubelet/pods/104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc/volumes" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273021 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-nlxqk"] Oct 10 07:08:54 crc kubenswrapper[4732]: E1010 07:08:54.273564 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2578031-2533-4d9f-b953-0452e05e88e8" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273577 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2578031-2533-4d9f-b953-0452e05e88e8" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: E1010 07:08:54.273600 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf4e373d-3210-42f5-9ec0-506c454718d2" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273607 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf4e373d-3210-42f5-9ec0-506c454718d2" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: E1010 
07:08:54.273617 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee98afa3-91b1-4d45-9bf8-e3659b14be63" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273623 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee98afa3-91b1-4d45-9bf8-e3659b14be63" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: E1010 07:08:54.273640 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerName="dnsmasq-dns" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273648 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerName="dnsmasq-dns" Oct 10 07:08:54 crc kubenswrapper[4732]: E1010 07:08:54.273659 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerName="init" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273664 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerName="init" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273876 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2578031-2533-4d9f-b953-0452e05e88e8" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273888 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf4e373d-3210-42f5-9ec0-506c454718d2" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273899 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee98afa3-91b1-4d45-9bf8-e3659b14be63" containerName="mariadb-account-create" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.273915 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="104eb8c7-67ff-4fc5-9ac3-d2aca9034ebc" containerName="dnsmasq-dns" Oct 10 07:08:54 crc kubenswrapper[4732]: 
I1010 07:08:54.274516 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.276713 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.276994 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rr4dh" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.281177 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nlxqk"] Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.360840 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-db-sync-config-data\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.360921 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv6jv\" (UniqueName: \"kubernetes.io/projected/ef029637-5b71-415f-8e57-ceec4c813be6-kube-api-access-vv6jv\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.361251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-config-data\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.361477 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-combined-ca-bundle\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.462720 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv6jv\" (UniqueName: \"kubernetes.io/projected/ef029637-5b71-415f-8e57-ceec4c813be6-kube-api-access-vv6jv\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.462849 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-config-data\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.462914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-combined-ca-bundle\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.462955 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-db-sync-config-data\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.467929 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-db-sync-config-data\") 
pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.474317 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-combined-ca-bundle\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.477758 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lzkzk" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" probeResult="failure" output=< Oct 10 07:08:54 crc kubenswrapper[4732]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 10 07:08:54 crc kubenswrapper[4732]: > Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.482405 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv6jv\" (UniqueName: \"kubernetes.io/projected/ef029637-5b71-415f-8e57-ceec4c813be6-kube-api-access-vv6jv\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.485487 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-config-data\") pod \"glance-db-sync-nlxqk\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:54 crc kubenswrapper[4732]: I1010 07:08:54.596168 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nlxqk" Oct 10 07:08:55 crc kubenswrapper[4732]: I1010 07:08:55.229764 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nlxqk"] Oct 10 07:08:55 crc kubenswrapper[4732]: I1010 07:08:55.356101 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:08:55 crc kubenswrapper[4732]: I1010 07:08:55.356175 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:08:55 crc kubenswrapper[4732]: I1010 07:08:55.590568 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nlxqk" event={"ID":"ef029637-5b71-415f-8e57-ceec4c813be6","Type":"ContainerStarted","Data":"6ac2151ec3e996b6bd3382c2cad6180dbbcb229866eed68698e198d4542c1882"} Oct 10 07:08:55 crc kubenswrapper[4732]: I1010 07:08:55.593005 4732 generic.go:334] "Generic (PLEG): container finished" podID="898d7ec3-23c2-40a7-b224-cb69ac84e188" containerID="94533af429c5430b6ef2119bcdaab1c9b6bf64102f9f0379d38c2cf1c7403d0d" exitCode=0 Oct 10 07:08:55 crc kubenswrapper[4732]: I1010 07:08:55.593064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p5t8l" event={"ID":"898d7ec3-23c2-40a7-b224-cb69ac84e188","Type":"ContainerDied","Data":"94533af429c5430b6ef2119bcdaab1c9b6bf64102f9f0379d38c2cf1c7403d0d"} Oct 10 07:08:56 crc kubenswrapper[4732]: I1010 07:08:56.947942 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.107112 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-swiftconf\") pod \"898d7ec3-23c2-40a7-b224-cb69ac84e188\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.107157 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-dispersionconf\") pod \"898d7ec3-23c2-40a7-b224-cb69ac84e188\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.107214 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-combined-ca-bundle\") pod \"898d7ec3-23c2-40a7-b224-cb69ac84e188\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.107271 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8lz7\" (UniqueName: \"kubernetes.io/projected/898d7ec3-23c2-40a7-b224-cb69ac84e188-kube-api-access-f8lz7\") pod \"898d7ec3-23c2-40a7-b224-cb69ac84e188\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.107294 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/898d7ec3-23c2-40a7-b224-cb69ac84e188-etc-swift\") pod \"898d7ec3-23c2-40a7-b224-cb69ac84e188\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.107462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-scripts\") pod \"898d7ec3-23c2-40a7-b224-cb69ac84e188\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.107486 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-ring-data-devices\") pod \"898d7ec3-23c2-40a7-b224-cb69ac84e188\" (UID: \"898d7ec3-23c2-40a7-b224-cb69ac84e188\") " Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.108406 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "898d7ec3-23c2-40a7-b224-cb69ac84e188" (UID: "898d7ec3-23c2-40a7-b224-cb69ac84e188"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.108589 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/898d7ec3-23c2-40a7-b224-cb69ac84e188-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "898d7ec3-23c2-40a7-b224-cb69ac84e188" (UID: "898d7ec3-23c2-40a7-b224-cb69ac84e188"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.118122 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "898d7ec3-23c2-40a7-b224-cb69ac84e188" (UID: "898d7ec3-23c2-40a7-b224-cb69ac84e188"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.129137 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-scripts" (OuterVolumeSpecName: "scripts") pod "898d7ec3-23c2-40a7-b224-cb69ac84e188" (UID: "898d7ec3-23c2-40a7-b224-cb69ac84e188"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.130669 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/898d7ec3-23c2-40a7-b224-cb69ac84e188-kube-api-access-f8lz7" (OuterVolumeSpecName: "kube-api-access-f8lz7") pod "898d7ec3-23c2-40a7-b224-cb69ac84e188" (UID: "898d7ec3-23c2-40a7-b224-cb69ac84e188"). InnerVolumeSpecName "kube-api-access-f8lz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.138164 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "898d7ec3-23c2-40a7-b224-cb69ac84e188" (UID: "898d7ec3-23c2-40a7-b224-cb69ac84e188"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.142740 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "898d7ec3-23c2-40a7-b224-cb69ac84e188" (UID: "898d7ec3-23c2-40a7-b224-cb69ac84e188"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.208752 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8lz7\" (UniqueName: \"kubernetes.io/projected/898d7ec3-23c2-40a7-b224-cb69ac84e188-kube-api-access-f8lz7\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.208982 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/898d7ec3-23c2-40a7-b224-cb69ac84e188-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.209068 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.209130 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/898d7ec3-23c2-40a7-b224-cb69ac84e188-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.209183 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.209241 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.209302 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/898d7ec3-23c2-40a7-b224-cb69ac84e188-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.609119 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-p5t8l" event={"ID":"898d7ec3-23c2-40a7-b224-cb69ac84e188","Type":"ContainerDied","Data":"51c0f2e4b455eab47c0fa8a02670fdcb6b35611bc60bc745a0ab4fd94b9b81e6"} Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.609162 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c0f2e4b455eab47c0fa8a02670fdcb6b35611bc60bc745a0ab4fd94b9b81e6" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.609186 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p5t8l" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.818121 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.824112 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"swift-storage-0\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " pod="openstack/swift-storage-0" Oct 10 07:08:57 crc kubenswrapper[4732]: I1010 07:08:57.882758 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 10 07:08:58 crc kubenswrapper[4732]: I1010 07:08:58.354530 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 10 07:08:58 crc kubenswrapper[4732]: W1010 07:08:58.366857 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ec5be94_f09a_4728_8858_c18fbd9ca2c2.slice/crio-5c30c7cb3461534a3149c9cdc991691349664994d1ba24bf0904fe56a2f748d8 WatchSource:0}: Error finding container 5c30c7cb3461534a3149c9cdc991691349664994d1ba24bf0904fe56a2f748d8: Status 404 returned error can't find the container with id 5c30c7cb3461534a3149c9cdc991691349664994d1ba24bf0904fe56a2f748d8 Oct 10 07:08:58 crc kubenswrapper[4732]: I1010 07:08:58.620177 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"5c30c7cb3461534a3149c9cdc991691349664994d1ba24bf0904fe56a2f748d8"} Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.490802 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lzkzk" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" probeResult="failure" output=< Oct 10 07:08:59 crc kubenswrapper[4732]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 10 07:08:59 crc kubenswrapper[4732]: > Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.507763 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.517771 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.776960 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-lzkzk-config-hrcdd"] Oct 10 07:08:59 crc kubenswrapper[4732]: E1010 07:08:59.778110 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="898d7ec3-23c2-40a7-b224-cb69ac84e188" containerName="swift-ring-rebalance" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.778144 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="898d7ec3-23c2-40a7-b224-cb69ac84e188" containerName="swift-ring-rebalance" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.778572 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="898d7ec3-23c2-40a7-b224-cb69ac84e188" containerName="swift-ring-rebalance" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.779594 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.782976 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.799600 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzkzk-config-hrcdd"] Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.854953 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.855039 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-scripts\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " 
pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.855085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-additional-scripts\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.855166 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run-ovn\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.855194 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-log-ovn\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.855246 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rdxx\" (UniqueName: \"kubernetes.io/projected/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-kube-api-access-4rdxx\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.956729 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run\") pod 
\"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.956793 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-scripts\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.956823 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-additional-scripts\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.956871 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run-ovn\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.956891 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-log-ovn\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.956920 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rdxx\" (UniqueName: \"kubernetes.io/projected/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-kube-api-access-4rdxx\") pod 
\"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.957126 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.957382 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run-ovn\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.957433 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-log-ovn\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.958307 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-additional-scripts\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.960468 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-scripts\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: 
\"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:08:59 crc kubenswrapper[4732]: I1010 07:08:59.978035 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rdxx\" (UniqueName: \"kubernetes.io/projected/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-kube-api-access-4rdxx\") pod \"ovn-controller-lzkzk-config-hrcdd\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:09:00 crc kubenswrapper[4732]: I1010 07:09:00.114002 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:09:00 crc kubenswrapper[4732]: I1010 07:09:00.619901 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzkzk-config-hrcdd"] Oct 10 07:09:04 crc kubenswrapper[4732]: I1010 07:09:04.477297 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lzkzk" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" probeResult="failure" output=< Oct 10 07:09:04 crc kubenswrapper[4732]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 10 07:09:04 crc kubenswrapper[4732]: > Oct 10 07:09:05 crc kubenswrapper[4732]: I1010 07:09:05.553928 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 10 07:09:05 crc kubenswrapper[4732]: I1010 07:09:05.898083 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-v7wvk"] Oct 10 07:09:05 crc kubenswrapper[4732]: I1010 07:09:05.899306 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-v7wvk" Oct 10 07:09:05 crc kubenswrapper[4732]: I1010 07:09:05.908050 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v7wvk"] Oct 10 07:09:05 crc kubenswrapper[4732]: I1010 07:09:05.953301 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4sl2\" (UniqueName: \"kubernetes.io/projected/eec93206-5a99-4edd-a303-0d8dee1658dc-kube-api-access-w4sl2\") pod \"barbican-db-create-v7wvk\" (UID: \"eec93206-5a99-4edd-a303-0d8dee1658dc\") " pod="openstack/barbican-db-create-v7wvk" Oct 10 07:09:05 crc kubenswrapper[4732]: I1010 07:09:05.953895 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:09:05 crc kubenswrapper[4732]: I1010 07:09:05.999392 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ng442"] Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.000584 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ng442" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.026386 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ng442"] Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.055819 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6prdb\" (UniqueName: \"kubernetes.io/projected/14fd1193-0d3e-4629-86df-7cb03f4b9b33-kube-api-access-6prdb\") pod \"cinder-db-create-ng442\" (UID: \"14fd1193-0d3e-4629-86df-7cb03f4b9b33\") " pod="openstack/cinder-db-create-ng442" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.056235 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4sl2\" (UniqueName: \"kubernetes.io/projected/eec93206-5a99-4edd-a303-0d8dee1658dc-kube-api-access-w4sl2\") pod \"barbican-db-create-v7wvk\" (UID: \"eec93206-5a99-4edd-a303-0d8dee1658dc\") " pod="openstack/barbican-db-create-v7wvk" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.075245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4sl2\" (UniqueName: \"kubernetes.io/projected/eec93206-5a99-4edd-a303-0d8dee1658dc-kube-api-access-w4sl2\") pod \"barbican-db-create-v7wvk\" (UID: \"eec93206-5a99-4edd-a303-0d8dee1658dc\") " pod="openstack/barbican-db-create-v7wvk" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.157758 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6prdb\" (UniqueName: \"kubernetes.io/projected/14fd1193-0d3e-4629-86df-7cb03f4b9b33-kube-api-access-6prdb\") pod \"cinder-db-create-ng442\" (UID: \"14fd1193-0d3e-4629-86df-7cb03f4b9b33\") " pod="openstack/cinder-db-create-ng442" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.193399 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-p2tz8"] Oct 10 07:09:06 crc 
kubenswrapper[4732]: I1010 07:09:06.194402 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.196646 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6prdb\" (UniqueName: \"kubernetes.io/projected/14fd1193-0d3e-4629-86df-7cb03f4b9b33-kube-api-access-6prdb\") pod \"cinder-db-create-ng442\" (UID: \"14fd1193-0d3e-4629-86df-7cb03f4b9b33\") " pod="openstack/cinder-db-create-ng442" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.197185 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.197354 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8gnjq" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.197494 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.207998 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.210838 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p2tz8"] Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.230243 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-v7wvk" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.259682 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-config-data\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.259762 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-combined-ca-bundle\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.259903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjxd9\" (UniqueName: \"kubernetes.io/projected/801dc083-4a38-4af1-9bf1-b40a3c204e09-kube-api-access-vjxd9\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.282241 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-mvmsb"] Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.283525 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mvmsb" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.297451 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mvmsb"] Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.319702 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ng442" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.361936 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjxd9\" (UniqueName: \"kubernetes.io/projected/801dc083-4a38-4af1-9bf1-b40a3c204e09-kube-api-access-vjxd9\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.362005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-config-data\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.362030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-combined-ca-bundle\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.362060 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc7m5\" (UniqueName: \"kubernetes.io/projected/e9928b55-49e9-4091-95de-77a8c1a01318-kube-api-access-mc7m5\") pod \"neutron-db-create-mvmsb\" (UID: \"e9928b55-49e9-4091-95de-77a8c1a01318\") " pod="openstack/neutron-db-create-mvmsb" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.366323 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-combined-ca-bundle\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 
07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.366445 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-config-data\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.380312 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjxd9\" (UniqueName: \"kubernetes.io/projected/801dc083-4a38-4af1-9bf1-b40a3c204e09-kube-api-access-vjxd9\") pod \"keystone-db-sync-p2tz8\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") " pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.463341 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc7m5\" (UniqueName: \"kubernetes.io/projected/e9928b55-49e9-4091-95de-77a8c1a01318-kube-api-access-mc7m5\") pod \"neutron-db-create-mvmsb\" (UID: \"e9928b55-49e9-4091-95de-77a8c1a01318\") " pod="openstack/neutron-db-create-mvmsb" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.479543 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc7m5\" (UniqueName: \"kubernetes.io/projected/e9928b55-49e9-4091-95de-77a8c1a01318-kube-api-access-mc7m5\") pod \"neutron-db-create-mvmsb\" (UID: \"e9928b55-49e9-4091-95de-77a8c1a01318\") " pod="openstack/neutron-db-create-mvmsb" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.539725 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:06 crc kubenswrapper[4732]: I1010 07:09:06.599370 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-mvmsb" Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.262565 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-v7wvk"] Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.546609 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p2tz8"] Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.622386 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ng442"] Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.624659 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-mvmsb"] Oct 10 07:09:07 crc kubenswrapper[4732]: W1010 07:09:07.628063 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9928b55_49e9_4091_95de_77a8c1a01318.slice/crio-1ee181515dc413fb852c3289a22b37f3767d8b4453c64f5751e0aa558faaac97 WatchSource:0}: Error finding container 1ee181515dc413fb852c3289a22b37f3767d8b4453c64f5751e0aa558faaac97: Status 404 returned error can't find the container with id 1ee181515dc413fb852c3289a22b37f3767d8b4453c64f5751e0aa558faaac97 Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.705109 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk-config-hrcdd" event={"ID":"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2","Type":"ContainerStarted","Data":"c5af1d769dd390d1c03be51f1c6e582e04b0639cabbea6907be5cbb11d6351ba"} Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.705161 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk-config-hrcdd" event={"ID":"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2","Type":"ContainerStarted","Data":"d5d3105eaaa0b3ca3b13ad867d683e75d2bafbdc1024aa7cccc18191731e2477"} Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.715211 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"440271d54f3b94f368b668f0086f762ecb8f963317d10585e119ad50bf50d796"} Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.715276 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"aea23c87d7a0e1648589bbcd40543c7d5e8ccf5a80b3a896677fc3b317ec2dda"} Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.717723 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2tz8" event={"ID":"801dc083-4a38-4af1-9bf1-b40a3c204e09","Type":"ContainerStarted","Data":"a0ce100642fe0104a42da1ffbd6b285d61a65715234afd03c74bd09eb18b72f1"} Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.720375 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mvmsb" event={"ID":"e9928b55-49e9-4091-95de-77a8c1a01318","Type":"ContainerStarted","Data":"1ee181515dc413fb852c3289a22b37f3767d8b4453c64f5751e0aa558faaac97"} Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.725457 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lzkzk-config-hrcdd" podStartSLOduration=8.725440444 podStartE2EDuration="8.725440444s" podCreationTimestamp="2025-10-10 07:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:07.723362638 +0000 UTC m=+1074.792953889" watchObservedRunningTime="2025-10-10 07:09:07.725440444 +0000 UTC m=+1074.795031685" Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.730996 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ng442" event={"ID":"14fd1193-0d3e-4629-86df-7cb03f4b9b33","Type":"ContainerStarted","Data":"862ba162a36cac53fb488f4d9310ec510968e19776f3a2d70b763da47e98c4e3"} Oct 10 
07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.737737 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v7wvk" event={"ID":"eec93206-5a99-4edd-a303-0d8dee1658dc","Type":"ContainerStarted","Data":"5f06c4bc79011bbd7290538dc030eb8dbdd3cf2ed05e58e3a6600d7565d73a5b"} Oct 10 07:09:07 crc kubenswrapper[4732]: I1010 07:09:07.775141 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-v7wvk" podStartSLOduration=2.7751170160000003 podStartE2EDuration="2.775117016s" podCreationTimestamp="2025-10-10 07:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:07.757022084 +0000 UTC m=+1074.826613345" watchObservedRunningTime="2025-10-10 07:09:07.775117016 +0000 UTC m=+1074.844708267" Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.748788 4732 generic.go:334] "Generic (PLEG): container finished" podID="eec93206-5a99-4edd-a303-0d8dee1658dc" containerID="c5860bbb5f8be0551d48d61d130a9c503672d099b5b9b81ad7f7930e2d1a74c7" exitCode=0 Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.748843 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v7wvk" event={"ID":"eec93206-5a99-4edd-a303-0d8dee1658dc","Type":"ContainerDied","Data":"c5860bbb5f8be0551d48d61d130a9c503672d099b5b9b81ad7f7930e2d1a74c7"} Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.753754 4732 generic.go:334] "Generic (PLEG): container finished" podID="ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" containerID="c5af1d769dd390d1c03be51f1c6e582e04b0639cabbea6907be5cbb11d6351ba" exitCode=0 Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.753820 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk-config-hrcdd" 
event={"ID":"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2","Type":"ContainerDied","Data":"c5af1d769dd390d1c03be51f1c6e582e04b0639cabbea6907be5cbb11d6351ba"} Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.759748 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"306e993d7c42ab68be7b6186fb51f97b059b5a7bcc1a130f1b6cecbe5bae570f"} Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.759795 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"04a8993decfa9c602d76b910a4eb75f9a4b7db875ca6bfa209f16244327dbd1a"} Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.762021 4732 generic.go:334] "Generic (PLEG): container finished" podID="e9928b55-49e9-4091-95de-77a8c1a01318" containerID="65ed626b6b047272043b5c3de8ee0323c467e7f133e08e9a6d0b893eee050ead" exitCode=0 Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.762124 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mvmsb" event={"ID":"e9928b55-49e9-4091-95de-77a8c1a01318","Type":"ContainerDied","Data":"65ed626b6b047272043b5c3de8ee0323c467e7f133e08e9a6d0b893eee050ead"} Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.765514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nlxqk" event={"ID":"ef029637-5b71-415f-8e57-ceec4c813be6","Type":"ContainerStarted","Data":"9313d24a807329b7bc908127acb7ed66fd3b29e3013536c8569a4d247c4da533"} Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.783150 4732 generic.go:334] "Generic (PLEG): container finished" podID="14fd1193-0d3e-4629-86df-7cb03f4b9b33" containerID="07d7e976e6a44abd2a9a09997eb9283af5abbbe0a5f237f7567c1ac40c5bfedb" exitCode=0 Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.783221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-ng442" event={"ID":"14fd1193-0d3e-4629-86df-7cb03f4b9b33","Type":"ContainerDied","Data":"07d7e976e6a44abd2a9a09997eb9283af5abbbe0a5f237f7567c1ac40c5bfedb"} Oct 10 07:09:08 crc kubenswrapper[4732]: I1010 07:09:08.796523 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-nlxqk" podStartSLOduration=3.123727625 podStartE2EDuration="14.796505635s" podCreationTimestamp="2025-10-10 07:08:54 +0000 UTC" firstStartedPulling="2025-10-10 07:08:55.239453321 +0000 UTC m=+1062.309044552" lastFinishedPulling="2025-10-10 07:09:06.912231321 +0000 UTC m=+1073.981822562" observedRunningTime="2025-10-10 07:09:08.790764209 +0000 UTC m=+1075.860355460" watchObservedRunningTime="2025-10-10 07:09:08.796505635 +0000 UTC m=+1075.866096876" Oct 10 07:09:09 crc kubenswrapper[4732]: I1010 07:09:09.481492 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lzkzk" Oct 10 07:09:10 crc kubenswrapper[4732]: I1010 07:09:10.152241 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mvmsb" Oct 10 07:09:10 crc kubenswrapper[4732]: I1010 07:09:10.241854 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc7m5\" (UniqueName: \"kubernetes.io/projected/e9928b55-49e9-4091-95de-77a8c1a01318-kube-api-access-mc7m5\") pod \"e9928b55-49e9-4091-95de-77a8c1a01318\" (UID: \"e9928b55-49e9-4091-95de-77a8c1a01318\") " Oct 10 07:09:10 crc kubenswrapper[4732]: I1010 07:09:10.252082 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9928b55-49e9-4091-95de-77a8c1a01318-kube-api-access-mc7m5" (OuterVolumeSpecName: "kube-api-access-mc7m5") pod "e9928b55-49e9-4091-95de-77a8c1a01318" (UID: "e9928b55-49e9-4091-95de-77a8c1a01318"). InnerVolumeSpecName "kube-api-access-mc7m5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:10 crc kubenswrapper[4732]: I1010 07:09:10.343842 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc7m5\" (UniqueName: \"kubernetes.io/projected/e9928b55-49e9-4091-95de-77a8c1a01318-kube-api-access-mc7m5\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:10 crc kubenswrapper[4732]: I1010 07:09:10.801584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-mvmsb" event={"ID":"e9928b55-49e9-4091-95de-77a8c1a01318","Type":"ContainerDied","Data":"1ee181515dc413fb852c3289a22b37f3767d8b4453c64f5751e0aa558faaac97"} Oct 10 07:09:10 crc kubenswrapper[4732]: I1010 07:09:10.801623 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-mvmsb" Oct 10 07:09:10 crc kubenswrapper[4732]: I1010 07:09:10.801625 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ee181515dc413fb852c3289a22b37f3767d8b4453c64f5751e0aa558faaac97" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.210192 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.234158 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ng442" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.241024 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-v7wvk" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.290726 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-scripts\") pod \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.290768 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run\") pod \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.290814 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-log-ovn\") pod \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.290839 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rdxx\" (UniqueName: \"kubernetes.io/projected/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-kube-api-access-4rdxx\") pod \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.290862 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run-ovn\") pod \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.290881 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-additional-scripts\") pod \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\" (UID: \"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291176 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" (UID: "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291178 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run" (OuterVolumeSpecName: "var-run") pod "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" (UID: "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291225 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" (UID: "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291575 4732 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291609 4732 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291647 4732 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-var-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291771 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" (UID: "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.291922 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-scripts" (OuterVolumeSpecName: "scripts") pod "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" (UID: "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.295932 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-kube-api-access-4rdxx" (OuterVolumeSpecName: "kube-api-access-4rdxx") pod "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" (UID: "ebfe1bdc-f1b1-4480-9421-e25d777d1ce2"). InnerVolumeSpecName "kube-api-access-4rdxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.392517 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6prdb\" (UniqueName: \"kubernetes.io/projected/14fd1193-0d3e-4629-86df-7cb03f4b9b33-kube-api-access-6prdb\") pod \"14fd1193-0d3e-4629-86df-7cb03f4b9b33\" (UID: \"14fd1193-0d3e-4629-86df-7cb03f4b9b33\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.392584 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4sl2\" (UniqueName: \"kubernetes.io/projected/eec93206-5a99-4edd-a303-0d8dee1658dc-kube-api-access-w4sl2\") pod \"eec93206-5a99-4edd-a303-0d8dee1658dc\" (UID: \"eec93206-5a99-4edd-a303-0d8dee1658dc\") " Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.393041 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.393063 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rdxx\" (UniqueName: \"kubernetes.io/projected/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-kube-api-access-4rdxx\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.393076 4732 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.395536 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14fd1193-0d3e-4629-86df-7cb03f4b9b33-kube-api-access-6prdb" (OuterVolumeSpecName: "kube-api-access-6prdb") pod "14fd1193-0d3e-4629-86df-7cb03f4b9b33" (UID: "14fd1193-0d3e-4629-86df-7cb03f4b9b33"). InnerVolumeSpecName "kube-api-access-6prdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.396018 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eec93206-5a99-4edd-a303-0d8dee1658dc-kube-api-access-w4sl2" (OuterVolumeSpecName: "kube-api-access-w4sl2") pod "eec93206-5a99-4edd-a303-0d8dee1658dc" (UID: "eec93206-5a99-4edd-a303-0d8dee1658dc"). InnerVolumeSpecName "kube-api-access-w4sl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.494764 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6prdb\" (UniqueName: \"kubernetes.io/projected/14fd1193-0d3e-4629-86df-7cb03f4b9b33-kube-api-access-6prdb\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.494797 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4sl2\" (UniqueName: \"kubernetes.io/projected/eec93206-5a99-4edd-a303-0d8dee1658dc-kube-api-access-w4sl2\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.824681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ng442" event={"ID":"14fd1193-0d3e-4629-86df-7cb03f4b9b33","Type":"ContainerDied","Data":"862ba162a36cac53fb488f4d9310ec510968e19776f3a2d70b763da47e98c4e3"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.825013 4732 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="862ba162a36cac53fb488f4d9310ec510968e19776f3a2d70b763da47e98c4e3" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.825303 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ng442" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.826520 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-v7wvk" event={"ID":"eec93206-5a99-4edd-a303-0d8dee1658dc","Type":"ContainerDied","Data":"5f06c4bc79011bbd7290538dc030eb8dbdd3cf2ed05e58e3a6600d7565d73a5b"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.826570 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f06c4bc79011bbd7290538dc030eb8dbdd3cf2ed05e58e3a6600d7565d73a5b" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.826615 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-v7wvk" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.829462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk-config-hrcdd" event={"ID":"ebfe1bdc-f1b1-4480-9421-e25d777d1ce2","Type":"ContainerDied","Data":"d5d3105eaaa0b3ca3b13ad867d683e75d2bafbdc1024aa7cccc18191731e2477"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.829504 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5d3105eaaa0b3ca3b13ad867d683e75d2bafbdc1024aa7cccc18191731e2477" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.829573 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lzkzk-config-hrcdd" Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.835390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"43a213f53856bce5c190f44e7458e042262da79e1784f2045dda4e75dc3471b6"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.835495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"bcadb525584dba5a9a1af302bfd2be19ff703a8c744b55dbf166f43746dfd5fa"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.835528 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"ca80afd8ea95c25e8f07db4e28d154c1d53a72ce3f36789ae3ff9af29cf3a561"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.835540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"4577435a75c7166b15559759271a9948adb5a88482a2db26d6c48d48b9208d39"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.838172 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2tz8" event={"ID":"801dc083-4a38-4af1-9bf1-b40a3c204e09","Type":"ContainerStarted","Data":"0e4ff507cdac3d345c7599b2f38569e72cbb904578805d9196ce57b351e8fc3a"} Oct 10 07:09:13 crc kubenswrapper[4732]: I1010 07:09:13.870366 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-p2tz8" podStartSLOduration=2.424216324 podStartE2EDuration="7.870346641s" podCreationTimestamp="2025-10-10 07:09:06 +0000 UTC" firstStartedPulling="2025-10-10 07:09:07.5665434 +0000 UTC m=+1074.636134641" lastFinishedPulling="2025-10-10 
07:09:13.012673717 +0000 UTC m=+1080.082264958" observedRunningTime="2025-10-10 07:09:13.86517568 +0000 UTC m=+1080.934766921" watchObservedRunningTime="2025-10-10 07:09:13.870346641 +0000 UTC m=+1080.939937892" Oct 10 07:09:14 crc kubenswrapper[4732]: I1010 07:09:14.317675 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lzkzk-config-hrcdd"] Oct 10 07:09:14 crc kubenswrapper[4732]: I1010 07:09:14.324857 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lzkzk-config-hrcdd"] Oct 10 07:09:14 crc kubenswrapper[4732]: I1010 07:09:14.884262 4732 generic.go:334] "Generic (PLEG): container finished" podID="ef029637-5b71-415f-8e57-ceec4c813be6" containerID="9313d24a807329b7bc908127acb7ed66fd3b29e3013536c8569a4d247c4da533" exitCode=0 Oct 10 07:09:14 crc kubenswrapper[4732]: I1010 07:09:14.884347 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nlxqk" event={"ID":"ef029637-5b71-415f-8e57-ceec4c813be6","Type":"ContainerDied","Data":"9313d24a807329b7bc908127acb7ed66fd3b29e3013536c8569a4d247c4da533"} Oct 10 07:09:15 crc kubenswrapper[4732]: I1010 07:09:15.671359 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" path="/var/lib/kubelet/pods/ebfe1bdc-f1b1-4480-9421-e25d777d1ce2/volumes" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.120816 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e3b5-account-create-fcdjv"] Oct 10 07:09:16 crc kubenswrapper[4732]: E1010 07:09:16.121287 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eec93206-5a99-4edd-a303-0d8dee1658dc" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121301 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eec93206-5a99-4edd-a303-0d8dee1658dc" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: E1010 
07:09:16.121330 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14fd1193-0d3e-4629-86df-7cb03f4b9b33" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121339 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="14fd1193-0d3e-4629-86df-7cb03f4b9b33" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: E1010 07:09:16.121351 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" containerName="ovn-config" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121358 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" containerName="ovn-config" Oct 10 07:09:16 crc kubenswrapper[4732]: E1010 07:09:16.121400 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9928b55-49e9-4091-95de-77a8c1a01318" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121409 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9928b55-49e9-4091-95de-77a8c1a01318" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121631 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9928b55-49e9-4091-95de-77a8c1a01318" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121682 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfe1bdc-f1b1-4480-9421-e25d777d1ce2" containerName="ovn-config" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121729 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eec93206-5a99-4edd-a303-0d8dee1658dc" containerName="mariadb-database-create" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.121749 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="14fd1193-0d3e-4629-86df-7cb03f4b9b33" containerName="mariadb-database-create" 
Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.122428 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e3b5-account-create-fcdjv" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.124783 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 10 07:09:16 crc kubenswrapper[4732]: I1010 07:09:16.141497 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e3b5-account-create-fcdjv"] Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:16.250289 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fptnq\" (UniqueName: \"kubernetes.io/projected/2e5a89ea-9d74-48e6-8255-62ebd3feaa52-kube-api-access-fptnq\") pod \"neutron-e3b5-account-create-fcdjv\" (UID: \"2e5a89ea-9d74-48e6-8255-62ebd3feaa52\") " pod="openstack/neutron-e3b5-account-create-fcdjv" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:16.352469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fptnq\" (UniqueName: \"kubernetes.io/projected/2e5a89ea-9d74-48e6-8255-62ebd3feaa52-kube-api-access-fptnq\") pod \"neutron-e3b5-account-create-fcdjv\" (UID: \"2e5a89ea-9d74-48e6-8255-62ebd3feaa52\") " pod="openstack/neutron-e3b5-account-create-fcdjv" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:16.374612 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fptnq\" (UniqueName: \"kubernetes.io/projected/2e5a89ea-9d74-48e6-8255-62ebd3feaa52-kube-api-access-fptnq\") pod \"neutron-e3b5-account-create-fcdjv\" (UID: \"2e5a89ea-9d74-48e6-8255-62ebd3feaa52\") " pod="openstack/neutron-e3b5-account-create-fcdjv" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:16.445714 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e3b5-account-create-fcdjv" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:19.891064 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nlxqk" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:19.923776 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nlxqk" event={"ID":"ef029637-5b71-415f-8e57-ceec4c813be6","Type":"ContainerDied","Data":"6ac2151ec3e996b6bd3382c2cad6180dbbcb229866eed68698e198d4542c1882"} Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:19.923809 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nlxqk" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:19.923818 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ac2151ec3e996b6bd3382c2cad6180dbbcb229866eed68698e198d4542c1882" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.018557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-db-sync-config-data\") pod \"ef029637-5b71-415f-8e57-ceec4c813be6\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.018662 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-config-data\") pod \"ef029637-5b71-415f-8e57-ceec4c813be6\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.018882 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv6jv\" (UniqueName: \"kubernetes.io/projected/ef029637-5b71-415f-8e57-ceec4c813be6-kube-api-access-vv6jv\") pod \"ef029637-5b71-415f-8e57-ceec4c813be6\" (UID: 
\"ef029637-5b71-415f-8e57-ceec4c813be6\") " Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.018928 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-combined-ca-bundle\") pod \"ef029637-5b71-415f-8e57-ceec4c813be6\" (UID: \"ef029637-5b71-415f-8e57-ceec4c813be6\") " Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.024946 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ef029637-5b71-415f-8e57-ceec4c813be6" (UID: "ef029637-5b71-415f-8e57-ceec4c813be6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.025947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef029637-5b71-415f-8e57-ceec4c813be6-kube-api-access-vv6jv" (OuterVolumeSpecName: "kube-api-access-vv6jv") pod "ef029637-5b71-415f-8e57-ceec4c813be6" (UID: "ef029637-5b71-415f-8e57-ceec4c813be6"). InnerVolumeSpecName "kube-api-access-vv6jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.047006 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef029637-5b71-415f-8e57-ceec4c813be6" (UID: "ef029637-5b71-415f-8e57-ceec4c813be6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.067044 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-config-data" (OuterVolumeSpecName: "config-data") pod "ef029637-5b71-415f-8e57-ceec4c813be6" (UID: "ef029637-5b71-415f-8e57-ceec4c813be6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.120944 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.120971 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv6jv\" (UniqueName: \"kubernetes.io/projected/ef029637-5b71-415f-8e57-ceec4c813be6-kube-api-access-vv6jv\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.120981 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:20.120989 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ef029637-5b71-415f-8e57-ceec4c813be6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.254131 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78b7fd5b57-xttsm"] Oct 10 07:09:23 crc kubenswrapper[4732]: E1010 07:09:21.254840 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef029637-5b71-415f-8e57-ceec4c813be6" containerName="glance-db-sync" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.254858 
4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef029637-5b71-415f-8e57-ceec4c813be6" containerName="glance-db-sync" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.255060 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef029637-5b71-415f-8e57-ceec4c813be6" containerName="glance-db-sync" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.256253 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.275808 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b7fd5b57-xttsm"] Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.439203 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-nb\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.439260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v9j7\" (UniqueName: \"kubernetes.io/projected/04dced02-6453-4b70-8eba-8d15648e1177-kube-api-access-6v9j7\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.439285 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-dns-svc\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.439307 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-sb\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.439518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-config\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.540751 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-config\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.540872 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-nb\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.540910 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v9j7\" (UniqueName: \"kubernetes.io/projected/04dced02-6453-4b70-8eba-8d15648e1177-kube-api-access-6v9j7\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.540937 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-dns-svc\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.540959 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-sb\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.542083 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-config\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.542083 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-nb\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.542311 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-sb\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.542500 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-dns-svc\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.569235 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v9j7\" (UniqueName: \"kubernetes.io/projected/04dced02-6453-4b70-8eba-8d15648e1177-kube-api-access-6v9j7\") pod \"dnsmasq-dns-78b7fd5b57-xttsm\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") " pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:21.576439 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:23.560877 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e3b5-account-create-fcdjv"] Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:23.614427 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78b7fd5b57-xttsm"] Oct 10 07:09:23 crc kubenswrapper[4732]: W1010 07:09:23.622158 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04dced02_6453_4b70_8eba_8d15648e1177.slice/crio-9c29fe100e1febdd343392e667a3891265a08e3b3c1c1c8c88182d7204560c16 WatchSource:0}: Error finding container 9c29fe100e1febdd343392e667a3891265a08e3b3c1c1c8c88182d7204560c16: Status 404 returned error can't find the container with id 9c29fe100e1febdd343392e667a3891265a08e3b3c1c1c8c88182d7204560c16 Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:23.976428 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"e2917273d26b808e5a8fc08c8152f588e5014472d4e7a647ebdbecedcba84fda"} Oct 10 07:09:23 crc 
kubenswrapper[4732]: I1010 07:09:23.976733 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"84783f363dda1053c7f032969b8a9b632ff711d6f0764371f0d881ee3ad20516"}
Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:23.976745 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"d4e49bf9ad485c0fe0bbb4a2dbc2f08f31e1f3158c54e7e7a0fa81f3f0046870"}
Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:23.978566 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e3b5-account-create-fcdjv" event={"ID":"2e5a89ea-9d74-48e6-8255-62ebd3feaa52","Type":"ContainerStarted","Data":"dde9f40ba27d41aad5ca56d92142e35bed603ba909423eba0d09a696a6b9a237"}
Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:23.978599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e3b5-account-create-fcdjv" event={"ID":"2e5a89ea-9d74-48e6-8255-62ebd3feaa52","Type":"ContainerStarted","Data":"048152c3ffcf5a0782c2aca7ade8aacb0099cf272e0bec5c66dd76c40d74ea98"}
Oct 10 07:09:23 crc kubenswrapper[4732]: I1010 07:09:23.980906 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" event={"ID":"04dced02-6453-4b70-8eba-8d15648e1177","Type":"ContainerStarted","Data":"9c29fe100e1febdd343392e667a3891265a08e3b3c1c1c8c88182d7204560c16"}
Oct 10 07:09:24 crc kubenswrapper[4732]: I1010 07:09:24.000954 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-e3b5-account-create-fcdjv" podStartSLOduration=8.000937236 podStartE2EDuration="8.000937236s" podCreationTimestamp="2025-10-10 07:09:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:23.991765536 +0000 UTC m=+1091.061356797" watchObservedRunningTime="2025-10-10 07:09:24.000937236 +0000 UTC m=+1091.070528477"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.017381 4732 generic.go:334] "Generic (PLEG): container finished" podID="04dced02-6453-4b70-8eba-8d15648e1177" containerID="0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3" exitCode=0
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.017684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" event={"ID":"04dced02-6453-4b70-8eba-8d15648e1177","Type":"ContainerDied","Data":"0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3"}
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.049515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"6aa268d1067b6515b564fbd351c694b7f8bd27f2ca765a2e848302e1ec2da0ec"}
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.050362 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"7c44462876be789a8e5caeabb0625c49ae5413ec6663dae73e6b157a5e977d76"}
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.050518 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"c875be356f22174bd7fe912809d07ce631dcb17edd6d1d6aabc340d517cc6551"}
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.050597 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerStarted","Data":"7740e27fefba27d5e80df5ff662cfd5fc4b86c96b608fa32c24f8d2b25cee4a2"}
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.052665 4732 generic.go:334] "Generic (PLEG): container finished" podID="2e5a89ea-9d74-48e6-8255-62ebd3feaa52" containerID="dde9f40ba27d41aad5ca56d92142e35bed603ba909423eba0d09a696a6b9a237" exitCode=0
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.052707 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e3b5-account-create-fcdjv" event={"ID":"2e5a89ea-9d74-48e6-8255-62ebd3feaa52","Type":"ContainerDied","Data":"dde9f40ba27d41aad5ca56d92142e35bed603ba909423eba0d09a696a6b9a237"}
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.110071 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.328516511 podStartE2EDuration="45.110052553s" podCreationTimestamp="2025-10-10 07:08:40 +0000 UTC" firstStartedPulling="2025-10-10 07:08:58.370553681 +0000 UTC m=+1065.440144922" lastFinishedPulling="2025-10-10 07:09:23.152089723 +0000 UTC m=+1090.221680964" observedRunningTime="2025-10-10 07:09:25.091506739 +0000 UTC m=+1092.161098000" watchObservedRunningTime="2025-10-10 07:09:25.110052553 +0000 UTC m=+1092.179643794"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.356264 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b7fd5b57-xttsm"]
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.356729 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.356763 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.387804 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-655549c8f-jpr8k"]
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.389168 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.397768 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.408379 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-655549c8f-jpr8k"]
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.530309 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-nb\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.530380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-swift-storage-0\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.530426 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-config\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.530512 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26sv\" (UniqueName: \"kubernetes.io/projected/53825ae6-1e7b-469e-9264-f16520e57021-kube-api-access-z26sv\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.530582 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-svc\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.530609 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-sb\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.632495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-svc\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.632542 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-sb\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.632588 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-nb\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.632607 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-swift-storage-0\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.632635 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-config\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.632700 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z26sv\" (UniqueName: \"kubernetes.io/projected/53825ae6-1e7b-469e-9264-f16520e57021-kube-api-access-z26sv\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.633738 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-svc\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.634059 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-nb\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.634277 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-sb\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.634416 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-swift-storage-0\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.634525 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-config\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.655790 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26sv\" (UniqueName: \"kubernetes.io/projected/53825ae6-1e7b-469e-9264-f16520e57021-kube-api-access-z26sv\") pod \"dnsmasq-dns-655549c8f-jpr8k\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.704033 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.918907 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d0a1-account-create-4wdrt"]
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.920274 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d0a1-account-create-4wdrt"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.923324 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 10 07:09:25 crc kubenswrapper[4732]: I1010 07:09:25.936506 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d0a1-account-create-4wdrt"]
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.038375 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwt9p\" (UniqueName: \"kubernetes.io/projected/06ba01b5-953d-4178-b73c-5b5f13268e13-kube-api-access-nwt9p\") pod \"barbican-d0a1-account-create-4wdrt\" (UID: \"06ba01b5-953d-4178-b73c-5b5f13268e13\") " pod="openstack/barbican-d0a1-account-create-4wdrt"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.062816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" event={"ID":"04dced02-6453-4b70-8eba-8d15648e1177","Type":"ContainerStarted","Data":"5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98"}
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.063872 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.085492 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" podStartSLOduration=5.085470371 podStartE2EDuration="5.085470371s" podCreationTimestamp="2025-10-10 07:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:26.080008972 +0000 UTC m=+1093.149600233" watchObservedRunningTime="2025-10-10 07:09:26.085470371 +0000 UTC m=+1093.155061612"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.127429 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-62ae-account-create-tps8w"]
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.129325 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-62ae-account-create-tps8w"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.132510 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.139987 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-62ae-account-create-tps8w"]
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.142248 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwt9p\" (UniqueName: \"kubernetes.io/projected/06ba01b5-953d-4178-b73c-5b5f13268e13-kube-api-access-nwt9p\") pod \"barbican-d0a1-account-create-4wdrt\" (UID: \"06ba01b5-953d-4178-b73c-5b5f13268e13\") " pod="openstack/barbican-d0a1-account-create-4wdrt"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.183990 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwt9p\" (UniqueName: \"kubernetes.io/projected/06ba01b5-953d-4178-b73c-5b5f13268e13-kube-api-access-nwt9p\") pod \"barbican-d0a1-account-create-4wdrt\" (UID: \"06ba01b5-953d-4178-b73c-5b5f13268e13\") " pod="openstack/barbican-d0a1-account-create-4wdrt"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.187997 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-655549c8f-jpr8k"]
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.244328 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd5vh\" (UniqueName: \"kubernetes.io/projected/eca127ce-2fe2-49bd-94f2-d772fcffb2d5-kube-api-access-nd5vh\") pod \"cinder-62ae-account-create-tps8w\" (UID: \"eca127ce-2fe2-49bd-94f2-d772fcffb2d5\") " pod="openstack/cinder-62ae-account-create-tps8w"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.279236 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d0a1-account-create-4wdrt"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.347556 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd5vh\" (UniqueName: \"kubernetes.io/projected/eca127ce-2fe2-49bd-94f2-d772fcffb2d5-kube-api-access-nd5vh\") pod \"cinder-62ae-account-create-tps8w\" (UID: \"eca127ce-2fe2-49bd-94f2-d772fcffb2d5\") " pod="openstack/cinder-62ae-account-create-tps8w"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.369271 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd5vh\" (UniqueName: \"kubernetes.io/projected/eca127ce-2fe2-49bd-94f2-d772fcffb2d5-kube-api-access-nd5vh\") pod \"cinder-62ae-account-create-tps8w\" (UID: \"eca127ce-2fe2-49bd-94f2-d772fcffb2d5\") " pod="openstack/cinder-62ae-account-create-tps8w"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.410090 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e3b5-account-create-fcdjv"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.459306 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-62ae-account-create-tps8w"
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.551209 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fptnq\" (UniqueName: \"kubernetes.io/projected/2e5a89ea-9d74-48e6-8255-62ebd3feaa52-kube-api-access-fptnq\") pod \"2e5a89ea-9d74-48e6-8255-62ebd3feaa52\" (UID: \"2e5a89ea-9d74-48e6-8255-62ebd3feaa52\") "
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.556628 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e5a89ea-9d74-48e6-8255-62ebd3feaa52-kube-api-access-fptnq" (OuterVolumeSpecName: "kube-api-access-fptnq") pod "2e5a89ea-9d74-48e6-8255-62ebd3feaa52" (UID: "2e5a89ea-9d74-48e6-8255-62ebd3feaa52"). InnerVolumeSpecName "kube-api-access-fptnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.654794 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fptnq\" (UniqueName: \"kubernetes.io/projected/2e5a89ea-9d74-48e6-8255-62ebd3feaa52-kube-api-access-fptnq\") on node \"crc\" DevicePath \"\""
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.723179 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d0a1-account-create-4wdrt"]
Oct 10 07:09:26 crc kubenswrapper[4732]: W1010 07:09:26.765092 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ba01b5_953d_4178_b73c_5b5f13268e13.slice/crio-10af41bab4ed5e05de37801b8cab454c3816e0c21de8818c3ca24e96032ae3a9 WatchSource:0}: Error finding container 10af41bab4ed5e05de37801b8cab454c3816e0c21de8818c3ca24e96032ae3a9: Status 404 returned error can't find the container with id 10af41bab4ed5e05de37801b8cab454c3816e0c21de8818c3ca24e96032ae3a9
Oct 10 07:09:26 crc kubenswrapper[4732]: I1010 07:09:26.903261 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-62ae-account-create-tps8w"]
Oct 10 07:09:26 crc kubenswrapper[4732]: W1010 07:09:26.908969 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeca127ce_2fe2_49bd_94f2_d772fcffb2d5.slice/crio-5c6a1f1873e1cdd4ecf766b0c8d7239bd39db7bb20ef6846d7bbbf97d6ff2a7e WatchSource:0}: Error finding container 5c6a1f1873e1cdd4ecf766b0c8d7239bd39db7bb20ef6846d7bbbf97d6ff2a7e: Status 404 returned error can't find the container with id 5c6a1f1873e1cdd4ecf766b0c8d7239bd39db7bb20ef6846d7bbbf97d6ff2a7e
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.071605 4732 generic.go:334] "Generic (PLEG): container finished" podID="801dc083-4a38-4af1-9bf1-b40a3c204e09" containerID="0e4ff507cdac3d345c7599b2f38569e72cbb904578805d9196ce57b351e8fc3a" exitCode=0
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.071712 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2tz8" event={"ID":"801dc083-4a38-4af1-9bf1-b40a3c204e09","Type":"ContainerDied","Data":"0e4ff507cdac3d345c7599b2f38569e72cbb904578805d9196ce57b351e8fc3a"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.073584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-62ae-account-create-tps8w" event={"ID":"eca127ce-2fe2-49bd-94f2-d772fcffb2d5","Type":"ContainerStarted","Data":"c1f28765adbde199f44b27a3b0a6b9a0ff884e2ac8249464dee32eb1609af819"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.073610 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-62ae-account-create-tps8w" event={"ID":"eca127ce-2fe2-49bd-94f2-d772fcffb2d5","Type":"ContainerStarted","Data":"5c6a1f1873e1cdd4ecf766b0c8d7239bd39db7bb20ef6846d7bbbf97d6ff2a7e"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.076012 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e3b5-account-create-fcdjv"
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.076076 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e3b5-account-create-fcdjv" event={"ID":"2e5a89ea-9d74-48e6-8255-62ebd3feaa52","Type":"ContainerDied","Data":"048152c3ffcf5a0782c2aca7ade8aacb0099cf272e0bec5c66dd76c40d74ea98"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.076106 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="048152c3ffcf5a0782c2aca7ade8aacb0099cf272e0bec5c66dd76c40d74ea98"
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.077303 4732 generic.go:334] "Generic (PLEG): container finished" podID="06ba01b5-953d-4178-b73c-5b5f13268e13" containerID="33f31a62b7e6468ccf5c144e82ff3c82dc099694dfba101874c4dfcecc1dcd59" exitCode=0
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.077359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d0a1-account-create-4wdrt" event={"ID":"06ba01b5-953d-4178-b73c-5b5f13268e13","Type":"ContainerDied","Data":"33f31a62b7e6468ccf5c144e82ff3c82dc099694dfba101874c4dfcecc1dcd59"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.077382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d0a1-account-create-4wdrt" event={"ID":"06ba01b5-953d-4178-b73c-5b5f13268e13","Type":"ContainerStarted","Data":"10af41bab4ed5e05de37801b8cab454c3816e0c21de8818c3ca24e96032ae3a9"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.078468 4732 generic.go:334] "Generic (PLEG): container finished" podID="53825ae6-1e7b-469e-9264-f16520e57021" containerID="36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d" exitCode=0
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.078544 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" event={"ID":"53825ae6-1e7b-469e-9264-f16520e57021","Type":"ContainerDied","Data":"36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.078576 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" event={"ID":"53825ae6-1e7b-469e-9264-f16520e57021","Type":"ContainerStarted","Data":"28edd2e3dc2f76685d7d460d5234a3f1fe24d8f39f3edb603c274b0d33be9ec1"}
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.078739 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" podUID="04dced02-6453-4b70-8eba-8d15648e1177" containerName="dnsmasq-dns" containerID="cri-o://5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98" gracePeriod=10
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.133983 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-62ae-account-create-tps8w" podStartSLOduration=1.1339619380000001 podStartE2EDuration="1.133961938s" podCreationTimestamp="2025-10-10 07:09:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:27.126104554 +0000 UTC m=+1094.195695795" watchObservedRunningTime="2025-10-10 07:09:27.133961938 +0000 UTC m=+1094.203553179"
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.563915 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm"
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.670119 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v9j7\" (UniqueName: \"kubernetes.io/projected/04dced02-6453-4b70-8eba-8d15648e1177-kube-api-access-6v9j7\") pod \"04dced02-6453-4b70-8eba-8d15648e1177\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") "
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.670169 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-dns-svc\") pod \"04dced02-6453-4b70-8eba-8d15648e1177\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") "
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.670204 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-config\") pod \"04dced02-6453-4b70-8eba-8d15648e1177\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") "
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.670266 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-nb\") pod \"04dced02-6453-4b70-8eba-8d15648e1177\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") "
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.670301 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-sb\") pod \"04dced02-6453-4b70-8eba-8d15648e1177\" (UID: \"04dced02-6453-4b70-8eba-8d15648e1177\") "
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.681049 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04dced02-6453-4b70-8eba-8d15648e1177-kube-api-access-6v9j7" (OuterVolumeSpecName: "kube-api-access-6v9j7") pod "04dced02-6453-4b70-8eba-8d15648e1177" (UID: "04dced02-6453-4b70-8eba-8d15648e1177"). InnerVolumeSpecName "kube-api-access-6v9j7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.713129 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04dced02-6453-4b70-8eba-8d15648e1177" (UID: "04dced02-6453-4b70-8eba-8d15648e1177"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.717510 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04dced02-6453-4b70-8eba-8d15648e1177" (UID: "04dced02-6453-4b70-8eba-8d15648e1177"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.717562 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04dced02-6453-4b70-8eba-8d15648e1177" (UID: "04dced02-6453-4b70-8eba-8d15648e1177"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.720875 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-config" (OuterVolumeSpecName: "config") pod "04dced02-6453-4b70-8eba-8d15648e1177" (UID: "04dced02-6453-4b70-8eba-8d15648e1177"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.773330 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v9j7\" (UniqueName: \"kubernetes.io/projected/04dced02-6453-4b70-8eba-8d15648e1177-kube-api-access-6v9j7\") on node \"crc\" DevicePath \"\""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.773385 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.773397 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-config\") on node \"crc\" DevicePath \"\""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.773409 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 10 07:09:27 crc kubenswrapper[4732]: I1010 07:09:27.773418 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04dced02-6453-4b70-8eba-8d15648e1177-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.086581 4732 generic.go:334] "Generic (PLEG): container finished" podID="04dced02-6453-4b70-8eba-8d15648e1177" containerID="5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98" exitCode=0
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.086619 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" event={"ID":"04dced02-6453-4b70-8eba-8d15648e1177","Type":"ContainerDied","Data":"5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98"}
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.086663 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm" event={"ID":"04dced02-6453-4b70-8eba-8d15648e1177","Type":"ContainerDied","Data":"9c29fe100e1febdd343392e667a3891265a08e3b3c1c1c8c88182d7204560c16"}
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.086682 4732 scope.go:117] "RemoveContainer" containerID="5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.086742 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78b7fd5b57-xttsm"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.090234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" event={"ID":"53825ae6-1e7b-469e-9264-f16520e57021","Type":"ContainerStarted","Data":"b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd"}
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.090570 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-655549c8f-jpr8k"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.092704 4732 generic.go:334] "Generic (PLEG): container finished" podID="eca127ce-2fe2-49bd-94f2-d772fcffb2d5" containerID="c1f28765adbde199f44b27a3b0a6b9a0ff884e2ac8249464dee32eb1609af819" exitCode=0
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.092928 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-62ae-account-create-tps8w" event={"ID":"eca127ce-2fe2-49bd-94f2-d772fcffb2d5","Type":"ContainerDied","Data":"c1f28765adbde199f44b27a3b0a6b9a0ff884e2ac8249464dee32eb1609af819"}
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.115152 4732 scope.go:117] "RemoveContainer" containerID="0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.117284 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" podStartSLOduration=3.117274631 podStartE2EDuration="3.117274631s" podCreationTimestamp="2025-10-10 07:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:28.110671281 +0000 UTC m=+1095.180262542" watchObservedRunningTime="2025-10-10 07:09:28.117274631 +0000 UTC m=+1095.186865872"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.153819 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78b7fd5b57-xttsm"]
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.160139 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78b7fd5b57-xttsm"]
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.163862 4732 scope.go:117] "RemoveContainer" containerID="5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98"
Oct 10 07:09:28 crc kubenswrapper[4732]: E1010 07:09:28.164263 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98\": container with ID starting with 5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98 not found: ID does not exist" containerID="5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.164302 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98"} err="failed to get container status \"5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98\": rpc error: code = NotFound desc = could not find container \"5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98\": container with ID starting with 5be0e8653454cafaef0aa1f2f3b4864a6f08da6068cf2a5dac097b4ba59feb98 not found: ID does not exist"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.164328 4732 scope.go:117] "RemoveContainer" containerID="0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3"
Oct 10 07:09:28 crc kubenswrapper[4732]: E1010 07:09:28.165038 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3\": container with ID starting with 0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3 not found: ID does not exist" containerID="0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.165077 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3"} err="failed to get container status \"0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3\": rpc error: code = NotFound desc = could not find container \"0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3\": container with ID starting with 0654b110d7c79d105b207393bd75d5a9f2c614d87ddd01696bc64d8f497665e3 not found: ID does not exist"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.506096 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d0a1-account-create-4wdrt"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.511487 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p2tz8"
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.586335 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwt9p\" (UniqueName: \"kubernetes.io/projected/06ba01b5-953d-4178-b73c-5b5f13268e13-kube-api-access-nwt9p\") pod \"06ba01b5-953d-4178-b73c-5b5f13268e13\" (UID: \"06ba01b5-953d-4178-b73c-5b5f13268e13\") "
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.586414 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjxd9\" (UniqueName: \"kubernetes.io/projected/801dc083-4a38-4af1-9bf1-b40a3c204e09-kube-api-access-vjxd9\") pod \"801dc083-4a38-4af1-9bf1-b40a3c204e09\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") "
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.586451 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-config-data\") pod \"801dc083-4a38-4af1-9bf1-b40a3c204e09\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") "
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.586472 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-combined-ca-bundle\") pod \"801dc083-4a38-4af1-9bf1-b40a3c204e09\" (UID: \"801dc083-4a38-4af1-9bf1-b40a3c204e09\") "
Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.591657 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ba01b5-953d-4178-b73c-5b5f13268e13-kube-api-access-nwt9p" (OuterVolumeSpecName: "kube-api-access-nwt9p") pod "06ba01b5-953d-4178-b73c-5b5f13268e13" (UID: "06ba01b5-953d-4178-b73c-5b5f13268e13"). InnerVolumeSpecName "kube-api-access-nwt9p".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.593200 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801dc083-4a38-4af1-9bf1-b40a3c204e09-kube-api-access-vjxd9" (OuterVolumeSpecName: "kube-api-access-vjxd9") pod "801dc083-4a38-4af1-9bf1-b40a3c204e09" (UID: "801dc083-4a38-4af1-9bf1-b40a3c204e09"). InnerVolumeSpecName "kube-api-access-vjxd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.620375 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801dc083-4a38-4af1-9bf1-b40a3c204e09" (UID: "801dc083-4a38-4af1-9bf1-b40a3c204e09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.630740 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-config-data" (OuterVolumeSpecName: "config-data") pod "801dc083-4a38-4af1-9bf1-b40a3c204e09" (UID: "801dc083-4a38-4af1-9bf1-b40a3c204e09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.688369 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwt9p\" (UniqueName: \"kubernetes.io/projected/06ba01b5-953d-4178-b73c-5b5f13268e13-kube-api-access-nwt9p\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.688421 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjxd9\" (UniqueName: \"kubernetes.io/projected/801dc083-4a38-4af1-9bf1-b40a3c204e09-kube-api-access-vjxd9\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.688435 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:28 crc kubenswrapper[4732]: I1010 07:09:28.688448 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801dc083-4a38-4af1-9bf1-b40a3c204e09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.103940 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d0a1-account-create-4wdrt" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.104210 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d0a1-account-create-4wdrt" event={"ID":"06ba01b5-953d-4178-b73c-5b5f13268e13","Type":"ContainerDied","Data":"10af41bab4ed5e05de37801b8cab454c3816e0c21de8818c3ca24e96032ae3a9"} Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.104851 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10af41bab4ed5e05de37801b8cab454c3816e0c21de8818c3ca24e96032ae3a9" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.107316 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p2tz8" event={"ID":"801dc083-4a38-4af1-9bf1-b40a3c204e09","Type":"ContainerDied","Data":"a0ce100642fe0104a42da1ffbd6b285d61a65715234afd03c74bd09eb18b72f1"} Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.107342 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0ce100642fe0104a42da1ffbd6b285d61a65715234afd03c74bd09eb18b72f1" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.107381 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p2tz8" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.348396 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-655549c8f-jpr8k"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.377941 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6jjrv"] Oct 10 07:09:29 crc kubenswrapper[4732]: E1010 07:09:29.378392 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ba01b5-953d-4178-b73c-5b5f13268e13" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378424 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ba01b5-953d-4178-b73c-5b5f13268e13" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: E1010 07:09:29.378446 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dced02-6453-4b70-8eba-8d15648e1177" containerName="init" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378454 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dced02-6453-4b70-8eba-8d15648e1177" containerName="init" Oct 10 07:09:29 crc kubenswrapper[4732]: E1010 07:09:29.378469 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e5a89ea-9d74-48e6-8255-62ebd3feaa52" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378477 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e5a89ea-9d74-48e6-8255-62ebd3feaa52" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: E1010 07:09:29.378492 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801dc083-4a38-4af1-9bf1-b40a3c204e09" containerName="keystone-db-sync" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378499 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="801dc083-4a38-4af1-9bf1-b40a3c204e09" containerName="keystone-db-sync" Oct 10 07:09:29 crc 
kubenswrapper[4732]: E1010 07:09:29.378515 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dced02-6453-4b70-8eba-8d15648e1177" containerName="dnsmasq-dns" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378522 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dced02-6453-4b70-8eba-8d15648e1177" containerName="dnsmasq-dns" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378751 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e5a89ea-9d74-48e6-8255-62ebd3feaa52" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378769 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="801dc083-4a38-4af1-9bf1-b40a3c204e09" containerName="keystone-db-sync" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378792 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ba01b5-953d-4178-b73c-5b5f13268e13" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.378806 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="04dced02-6453-4b70-8eba-8d15648e1177" containerName="dnsmasq-dns" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.379482 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.383569 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.383913 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.384086 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.384274 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8gnjq" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.391586 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6jjrv"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.400017 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79c8b784dc-sbvpb"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.401793 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.426013 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c8b784dc-sbvpb"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.503526 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-credential-keys\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.503584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-svc\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.503862 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-config-data\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.503909 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxwxd\" (UniqueName: \"kubernetes.io/projected/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-kube-api-access-dxwxd\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.503925 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-scripts\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.503943 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-swift-storage-0\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.504000 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-config\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.504075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8sm\" (UniqueName: \"kubernetes.io/projected/94c15080-3c41-4889-9266-977e7c858d18-kube-api-access-hn8sm\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.504138 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-fernet-keys\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.504220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-nb\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.504396 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-combined-ca-bundle\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.504502 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-sb\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.523586 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-62ae-account-create-tps8w" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.593764 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:09:29 crc kubenswrapper[4732]: E1010 07:09:29.594960 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca127ce-2fe2-49bd-94f2-d772fcffb2d5" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.595033 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca127ce-2fe2-49bd-94f2-d772fcffb2d5" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.595281 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca127ce-2fe2-49bd-94f2-d772fcffb2d5" containerName="mariadb-account-create" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.597301 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.610208 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd5vh\" (UniqueName: \"kubernetes.io/projected/eca127ce-2fe2-49bd-94f2-d772fcffb2d5-kube-api-access-nd5vh\") pod \"eca127ce-2fe2-49bd-94f2-d772fcffb2d5\" (UID: \"eca127ce-2fe2-49bd-94f2-d772fcffb2d5\") " Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.610947 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-combined-ca-bundle\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.611112 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-sb\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.611257 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-credential-keys\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.611372 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-svc\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.611473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-config-data\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.611566 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxwxd\" (UniqueName: \"kubernetes.io/projected/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-kube-api-access-dxwxd\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.611973 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-scripts\") pod 
\"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.612060 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-swift-storage-0\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.612179 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-config\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.612266 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8sm\" (UniqueName: \"kubernetes.io/projected/94c15080-3c41-4889-9266-977e7c858d18-kube-api-access-hn8sm\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.612369 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-fernet-keys\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.612470 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-nb\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: 
\"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.613584 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-svc\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.613790 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-nb\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.616952 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-config\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.620836 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-swift-storage-0\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.621497 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-sb\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc 
kubenswrapper[4732]: I1010 07:09:29.624929 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.625198 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.629827 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-combined-ca-bundle\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.630121 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.630573 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-credential-keys\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.637253 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca127ce-2fe2-49bd-94f2-d772fcffb2d5-kube-api-access-nd5vh" (OuterVolumeSpecName: "kube-api-access-nd5vh") pod "eca127ce-2fe2-49bd-94f2-d772fcffb2d5" (UID: "eca127ce-2fe2-49bd-94f2-d772fcffb2d5"). InnerVolumeSpecName "kube-api-access-nd5vh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.637779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-fernet-keys\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.641782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-scripts\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.655767 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxwxd\" (UniqueName: \"kubernetes.io/projected/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-kube-api-access-dxwxd\") pod \"dnsmasq-dns-79c8b784dc-sbvpb\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.661117 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-config-data\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.661228 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8sm\" (UniqueName: \"kubernetes.io/projected/94c15080-3c41-4889-9266-977e7c858d18-kube-api-access-hn8sm\") pod \"keystone-bootstrap-6jjrv\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.702772 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04dced02-6453-4b70-8eba-8d15648e1177" path="/var/lib/kubelet/pods/04dced02-6453-4b70-8eba-8d15648e1177/volumes" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.712207 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cx6h2"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714116 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-log-httpd\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714172 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-scripts\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714192 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc6mf\" (UniqueName: \"kubernetes.io/projected/7abb736c-8131-4268-9d1c-3ecf24023962-kube-api-access-bc6mf\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714228 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-config-data\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714276 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714303 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-run-httpd\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714320 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.714376 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd5vh\" (UniqueName: \"kubernetes.io/projected/eca127ce-2fe2-49bd-94f2-d772fcffb2d5-kube-api-access-nd5vh\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.715211 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.718896 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.720164 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x8v5s" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.720826 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.744752 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cx6h2"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.758970 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c8b784dc-sbvpb"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.759732 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.794210 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.798653 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d94456597-gz79x"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.800080 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.808320 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d94456597-gz79x"] Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819476 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-log-httpd\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819552 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-config-data\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819585 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-scripts\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc6mf\" (UniqueName: \"kubernetes.io/projected/7abb736c-8131-4268-9d1c-3ecf24023962-kube-api-access-bc6mf\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819648 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km56t\" (UniqueName: \"kubernetes.io/projected/e60af77c-522d-441f-9174-a0242edc0361-kube-api-access-km56t\") pod 
\"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-config-data\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819761 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-combined-ca-bundle\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819812 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819847 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-run-httpd\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 
07:09:29.819906 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e60af77c-522d-441f-9174-a0242edc0361-logs\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.819953 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-scripts\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.820392 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-log-httpd\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.821044 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-run-httpd\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.829473 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.830262 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-config-data\") pod 
\"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.834440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-scripts\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.837157 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.843647 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc6mf\" (UniqueName: \"kubernetes.io/projected/7abb736c-8131-4268-9d1c-3ecf24023962-kube-api-access-bc6mf\") pod \"ceilometer-0\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " pod="openstack/ceilometer-0" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.921520 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-scripts\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922117 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-svc\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922144 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-sb\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922188 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-config-data\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922277 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km56t\" (UniqueName: \"kubernetes.io/projected/e60af77c-522d-441f-9174-a0242edc0361-kube-api-access-km56t\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-config\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922342 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-combined-ca-bundle\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922365 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-s5b7j\" (UniqueName: \"kubernetes.io/projected/f5d0e89e-6480-4747-a693-9f3cac1fb87d-kube-api-access-s5b7j\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922394 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-swift-storage-0\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922423 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-nb\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922448 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e60af77c-522d-441f-9174-a0242edc0361-logs\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.922724 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e60af77c-522d-441f-9174-a0242edc0361-logs\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.925099 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-scripts\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.926555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-combined-ca-bundle\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.932949 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-config-data\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:29 crc kubenswrapper[4732]: I1010 07:09:29.939618 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km56t\" (UniqueName: \"kubernetes.io/projected/e60af77c-522d-441f-9174-a0242edc0361-kube-api-access-km56t\") pod \"placement-db-sync-cx6h2\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.023508 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.024171 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-svc\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.024227 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-sb\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.024338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-config\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.024383 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5b7j\" (UniqueName: \"kubernetes.io/projected/f5d0e89e-6480-4747-a693-9f3cac1fb87d-kube-api-access-s5b7j\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.024423 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-swift-storage-0\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 
07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.024469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-nb\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.024982 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-svc\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.025384 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-nb\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.025544 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-config\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.028885 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-sb\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.028926 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-swift-storage-0\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.045221 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5b7j\" (UniqueName: \"kubernetes.io/projected/f5d0e89e-6480-4747-a693-9f3cac1fb87d-kube-api-access-s5b7j\") pod \"dnsmasq-dns-7d94456597-gz79x\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.065629 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cx6h2" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.125033 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-62ae-account-create-tps8w" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.125011 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-62ae-account-create-tps8w" event={"ID":"eca127ce-2fe2-49bd-94f2-d772fcffb2d5","Type":"ContainerDied","Data":"5c6a1f1873e1cdd4ecf766b0c8d7239bd39db7bb20ef6846d7bbbf97d6ff2a7e"} Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.125389 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c6a1f1873e1cdd4ecf766b0c8d7239bd39db7bb20ef6846d7bbbf97d6ff2a7e" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.125150 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" podUID="53825ae6-1e7b-469e-9264-f16520e57021" containerName="dnsmasq-dns" containerID="cri-o://b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd" gracePeriod=10 Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.144845 
4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.280177 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c8b784dc-sbvpb"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.342542 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6jjrv"] Oct 10 07:09:30 crc kubenswrapper[4732]: W1010 07:09:30.364863 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94c15080_3c41_4889_9266_977e7c858d18.slice/crio-524b201636b8e44d887ec042cda3e0ecaa6d05c33fa060f001a3ae1076880b9d WatchSource:0}: Error finding container 524b201636b8e44d887ec042cda3e0ecaa6d05c33fa060f001a3ae1076880b9d: Status 404 returned error can't find the container with id 524b201636b8e44d887ec042cda3e0ecaa6d05c33fa060f001a3ae1076880b9d Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.496293 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.498108 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.502442 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.502625 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.502851 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.502978 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rr4dh" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.510240 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.578850 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.586435 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.589556 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.590052 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.602998 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.626765 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cx6h2"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.634103 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.634976 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.635020 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.635047 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.635105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnhkl\" (UniqueName: \"kubernetes.io/projected/0c9dbf1d-b64e-496f-ad78-593db301c457-kube-api-access-nnhkl\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.635122 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.635154 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-logs\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.635182 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.635210 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: W1010 07:09:30.664476 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7abb736c_8131_4268_9d1c_3ecf24023962.slice/crio-93be9b4e14917a6bae6a6b28962440e97b7b6a4188a00f9f3f61566c81fe912f WatchSource:0}: Error finding container 93be9b4e14917a6bae6a6b28962440e97b7b6a4188a00f9f3f61566c81fe912f: Status 404 returned error can't find the container with id 93be9b4e14917a6bae6a6b28962440e97b7b6a4188a00f9f3f61566c81fe912f Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.736848 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.736866 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737318 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737363 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnhkl\" (UniqueName: 
\"kubernetes.io/projected/0c9dbf1d-b64e-496f-ad78-593db301c457-kube-api-access-nnhkl\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737392 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgnqm\" (UniqueName: \"kubernetes.io/projected/5f83b91b-33d1-44fa-9e1c-999b804e51c7-kube-api-access-qgnqm\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737434 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737461 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737486 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737513 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737547 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-logs\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737632 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737822 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737871 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737913 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.737948 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.738003 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.739122 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-logs\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.739424 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: 
\"0c9dbf1d-b64e-496f-ad78-593db301c457\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.740134 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.756424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.762270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-config-data\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.770425 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.770526 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnhkl\" (UniqueName: \"kubernetes.io/projected/0c9dbf1d-b64e-496f-ad78-593db301c457-kube-api-access-nnhkl\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") 
" pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.775142 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-scripts\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.786892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849145 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-sb\") pod \"53825ae6-1e7b-469e-9264-f16520e57021\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849224 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-config\") pod \"53825ae6-1e7b-469e-9264-f16520e57021\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849271 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-swift-storage-0\") pod \"53825ae6-1e7b-469e-9264-f16520e57021\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849408 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-nb\") pod \"53825ae6-1e7b-469e-9264-f16520e57021\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849438 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z26sv\" (UniqueName: \"kubernetes.io/projected/53825ae6-1e7b-469e-9264-f16520e57021-kube-api-access-z26sv\") pod \"53825ae6-1e7b-469e-9264-f16520e57021\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849458 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-svc\") pod \"53825ae6-1e7b-469e-9264-f16520e57021\" (UID: \"53825ae6-1e7b-469e-9264-f16520e57021\") " Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849587 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.849672 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.850149 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgnqm\" (UniqueName: \"kubernetes.io/projected/5f83b91b-33d1-44fa-9e1c-999b804e51c7-kube-api-access-qgnqm\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.850182 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.850203 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.850223 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" 
Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.850278 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.850307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.850653 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.853392 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.855654 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.856383 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.856469 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.860226 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.862531 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.868898 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53825ae6-1e7b-469e-9264-f16520e57021-kube-api-access-z26sv" (OuterVolumeSpecName: "kube-api-access-z26sv") pod "53825ae6-1e7b-469e-9264-f16520e57021" (UID: "53825ae6-1e7b-469e-9264-f16520e57021"). InnerVolumeSpecName "kube-api-access-z26sv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.869430 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.876287 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgnqm\" (UniqueName: \"kubernetes.io/projected/5f83b91b-33d1-44fa-9e1c-999b804e51c7-kube-api-access-qgnqm\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.909210 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d94456597-gz79x"] Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.915340 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:30 crc kubenswrapper[4732]: W1010 07:09:30.925256 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d0e89e_6480_4747_a693_9f3cac1fb87d.slice/crio-57832ce2ecacfeff47b3189dd47a53478f22570c9b085158d7000f2f99316d35 WatchSource:0}: Error finding container 57832ce2ecacfeff47b3189dd47a53478f22570c9b085158d7000f2f99316d35: Status 404 returned error can't find the container with id 57832ce2ecacfeff47b3189dd47a53478f22570c9b085158d7000f2f99316d35 Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.945948 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53825ae6-1e7b-469e-9264-f16520e57021" (UID: "53825ae6-1e7b-469e-9264-f16520e57021"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.958171 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z26sv\" (UniqueName: \"kubernetes.io/projected/53825ae6-1e7b-469e-9264-f16520e57021-kube-api-access-z26sv\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.958214 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.959035 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53825ae6-1e7b-469e-9264-f16520e57021" (UID: "53825ae6-1e7b-469e-9264-f16520e57021"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.962848 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53825ae6-1e7b-469e-9264-f16520e57021" (UID: "53825ae6-1e7b-469e-9264-f16520e57021"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.978476 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-config" (OuterVolumeSpecName: "config") pod "53825ae6-1e7b-469e-9264-f16520e57021" (UID: "53825ae6-1e7b-469e-9264-f16520e57021"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:30 crc kubenswrapper[4732]: I1010 07:09:30.989039 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53825ae6-1e7b-469e-9264-f16520e57021" (UID: "53825ae6-1e7b-469e-9264-f16520e57021"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.059497 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.059766 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.059776 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.059785 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53825ae6-1e7b-469e-9264-f16520e57021-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:31 
crc kubenswrapper[4732]: I1010 07:09:31.132978 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerStarted","Data":"93be9b4e14917a6bae6a6b28962440e97b7b6a4188a00f9f3f61566c81fe912f"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.134222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jjrv" event={"ID":"94c15080-3c41-4889-9266-977e7c858d18","Type":"ContainerStarted","Data":"b767fdf343e2f054e157561352aab0e3261a07fc8d390dbd234c4209ebbdc59a"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.134261 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jjrv" event={"ID":"94c15080-3c41-4889-9266-977e7c858d18","Type":"ContainerStarted","Data":"524b201636b8e44d887ec042cda3e0ecaa6d05c33fa060f001a3ae1076880b9d"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.136170 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94456597-gz79x" event={"ID":"f5d0e89e-6480-4747-a693-9f3cac1fb87d","Type":"ContainerStarted","Data":"57832ce2ecacfeff47b3189dd47a53478f22570c9b085158d7000f2f99316d35"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.136936 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cx6h2" event={"ID":"e60af77c-522d-441f-9174-a0242edc0361","Type":"ContainerStarted","Data":"adde8706f52beb2b80ad945572a617167cc6c00e1977c5cdf85386673e7b0049"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.140200 4732 generic.go:334] "Generic (PLEG): container finished" podID="b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" containerID="7c4bf2f39bae28b55efdc8966ac8c233f8dedcdefb6c8c7fef65ab3a01f01439" exitCode=0 Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.140390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" 
event={"ID":"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c","Type":"ContainerDied","Data":"7c4bf2f39bae28b55efdc8966ac8c233f8dedcdefb6c8c7fef65ab3a01f01439"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.141238 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" event={"ID":"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c","Type":"ContainerStarted","Data":"0728a86a1fa7c5d39d95f1dbab9883a717e0a37007cb1f3ff3bb7eb3b2616b39"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.144913 4732 generic.go:334] "Generic (PLEG): container finished" podID="53825ae6-1e7b-469e-9264-f16520e57021" containerID="b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd" exitCode=0 Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.144957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" event={"ID":"53825ae6-1e7b-469e-9264-f16520e57021","Type":"ContainerDied","Data":"b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.144985 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" event={"ID":"53825ae6-1e7b-469e-9264-f16520e57021","Type":"ContainerDied","Data":"28edd2e3dc2f76685d7d460d5234a3f1fe24d8f39f3edb603c274b0d33be9ec1"} Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.145007 4732 scope.go:117] "RemoveContainer" containerID="b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.145149 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-655549c8f-jpr8k" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.166487 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6jjrv" podStartSLOduration=2.166343258 podStartE2EDuration="2.166343258s" podCreationTimestamp="2025-10-10 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:31.162948925 +0000 UTC m=+1098.232540196" watchObservedRunningTime="2025-10-10 07:09:31.166343258 +0000 UTC m=+1098.235934509" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.209846 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.237762 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-x6n9t"] Oct 10 07:09:31 crc kubenswrapper[4732]: E1010 07:09:31.238157 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53825ae6-1e7b-469e-9264-f16520e57021" containerName="init" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.238173 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="53825ae6-1e7b-469e-9264-f16520e57021" containerName="init" Oct 10 07:09:31 crc kubenswrapper[4732]: E1010 07:09:31.238186 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53825ae6-1e7b-469e-9264-f16520e57021" containerName="dnsmasq-dns" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.238192 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="53825ae6-1e7b-469e-9264-f16520e57021" containerName="dnsmasq-dns" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.238391 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="53825ae6-1e7b-469e-9264-f16520e57021" containerName="dnsmasq-dns" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 
07:09:31.238944 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.245315 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-88r8l" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.245546 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.250011 4732 scope.go:117] "RemoveContainer" containerID="36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.250910 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x6n9t"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.381677 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-db-sync-config-data\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.381766 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlgfc\" (UniqueName: \"kubernetes.io/projected/122832c9-8a6a-48f4-988c-7c4de7dd085a-kube-api-access-jlgfc\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.381845 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-combined-ca-bundle\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " 
pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.393834 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-q6pdc"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.395027 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.412861 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.413245 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fdw6m" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.413765 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.485324 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-combined-ca-bundle\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.485395 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-scripts\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.490729 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqq45\" (UniqueName: \"kubernetes.io/projected/b89ce220-623c-443f-93f4-4a960ffe29eb-kube-api-access-lqq45\") pod \"cinder-db-sync-q6pdc\" (UID: 
\"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.490773 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-db-sync-config-data\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.490829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-config-data\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.490849 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlgfc\" (UniqueName: \"kubernetes.io/projected/122832c9-8a6a-48f4-988c-7c4de7dd085a-kube-api-access-jlgfc\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.490864 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b89ce220-623c-443f-93f4-4a960ffe29eb-etc-machine-id\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.490931 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-db-sync-config-data\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " 
pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.491042 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-combined-ca-bundle\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.495478 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-q6pdc"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.522039 4732 scope.go:117] "RemoveContainer" containerID="b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.522557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-combined-ca-bundle\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.525230 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-db-sync-config-data\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: E1010 07:09:31.534506 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd\": container with ID starting with b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd not found: ID does not exist" containerID="b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd" Oct 10 07:09:31 crc 
kubenswrapper[4732]: I1010 07:09:31.534553 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd"} err="failed to get container status \"b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd\": rpc error: code = NotFound desc = could not find container \"b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd\": container with ID starting with b53151c1752bfab9bd3d7de38a8e7964b22bb4b886922221234c6031edc5c3dd not found: ID does not exist" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.534584 4732 scope.go:117] "RemoveContainer" containerID="36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d" Oct 10 07:09:31 crc kubenswrapper[4732]: E1010 07:09:31.545042 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d\": container with ID starting with 36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d not found: ID does not exist" containerID="36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.545086 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d"} err="failed to get container status \"36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d\": rpc error: code = NotFound desc = could not find container \"36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d\": container with ID starting with 36dad88a657198c61410a63f0f200533990e5aeb68584e26c7cc21233827269d not found: ID does not exist" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.567798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlgfc\" (UniqueName: 
\"kubernetes.io/projected/122832c9-8a6a-48f4-988c-7c4de7dd085a-kube-api-access-jlgfc\") pod \"barbican-db-sync-x6n9t\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.587785 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-655549c8f-jpr8k"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.599677 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-combined-ca-bundle\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.599744 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-scripts\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.599780 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqq45\" (UniqueName: \"kubernetes.io/projected/b89ce220-623c-443f-93f4-4a960ffe29eb-kube-api-access-lqq45\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.599813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-config-data\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.599828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b89ce220-623c-443f-93f4-4a960ffe29eb-etc-machine-id\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.599866 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-db-sync-config-data\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.648526 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-scripts\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.648993 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-db-sync-config-data\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.649455 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-combined-ca-bundle\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.651643 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.651925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-config-data\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.652370 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b89ce220-623c-443f-93f4-4a960ffe29eb-etc-machine-id\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.658274 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqq45\" (UniqueName: \"kubernetes.io/projected/b89ce220-623c-443f-93f4-4a960ffe29eb-kube-api-access-lqq45\") pod \"cinder-db-sync-q6pdc\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.746437 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.752890 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-655549c8f-jpr8k"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.805927 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.806663 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.814467 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.821884 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.897874 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-9sdb9"] Oct 10 07:09:31 crc kubenswrapper[4732]: E1010 07:09:31.898335 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" containerName="init" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.898352 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" containerName="init" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.898682 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" containerName="init" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.899385 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.903882 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9sdb9"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.907324 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.907574 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.908199 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pcm7s" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.961877 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-nb\") pod \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.961956 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxwxd\" (UniqueName: \"kubernetes.io/projected/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-kube-api-access-dxwxd\") pod \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.961981 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-svc\") pod \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.962050 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-swift-storage-0\") pod \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.962114 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-sb\") pod \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.962143 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-config\") pod \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\" (UID: \"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c\") " Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.962407 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwj4\" (UniqueName: \"kubernetes.io/projected/d4e63d44-6624-462c-9bbd-c6a160083bd0-kube-api-access-dqwj4\") pod \"neutron-db-sync-9sdb9\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.962466 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-combined-ca-bundle\") pod \"neutron-db-sync-9sdb9\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.962533 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-config\") pod \"neutron-db-sync-9sdb9\" (UID: 
\"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.963261 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:09:31 crc kubenswrapper[4732]: I1010 07:09:31.975847 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-kube-api-access-dxwxd" (OuterVolumeSpecName: "kube-api-access-dxwxd") pod "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" (UID: "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c"). InnerVolumeSpecName "kube-api-access-dxwxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.003633 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" (UID: "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.004207 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" (UID: "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.019599 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-config" (OuterVolumeSpecName: "config") pod "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" (UID: "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.026573 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" (UID: "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.028365 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" (UID: "b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064591 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwj4\" (UniqueName: \"kubernetes.io/projected/d4e63d44-6624-462c-9bbd-c6a160083bd0-kube-api-access-dqwj4\") pod \"neutron-db-sync-9sdb9\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064675 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-combined-ca-bundle\") pod \"neutron-db-sync-9sdb9\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064753 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-config\") pod \"neutron-db-sync-9sdb9\" (UID: 
\"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064851 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064864 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064873 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064882 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxwxd\" (UniqueName: \"kubernetes.io/projected/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-kube-api-access-dxwxd\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064892 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.064901 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.069151 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-combined-ca-bundle\") pod \"neutron-db-sync-9sdb9\" (UID: 
\"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.069343 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-config\") pod \"neutron-db-sync-9sdb9\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.083827 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwj4\" (UniqueName: \"kubernetes.io/projected/d4e63d44-6624-462c-9bbd-c6a160083bd0-kube-api-access-dqwj4\") pod \"neutron-db-sync-9sdb9\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.117560 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:32 crc kubenswrapper[4732]: W1010 07:09:32.137068 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f83b91b_33d1_44fa_9e1c_999b804e51c7.slice/crio-0d37a711bc22c7327bbcd1cbd3831e32bd7f03a0d63049aaaae7b226cd723229 WatchSource:0}: Error finding container 0d37a711bc22c7327bbcd1cbd3831e32bd7f03a0d63049aaaae7b226cd723229: Status 404 returned error can't find the container with id 0d37a711bc22c7327bbcd1cbd3831e32bd7f03a0d63049aaaae7b226cd723229 Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.168253 4732 generic.go:334] "Generic (PLEG): container finished" podID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerID="56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63" exitCode=0 Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.168338 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94456597-gz79x" 
event={"ID":"f5d0e89e-6480-4747-a693-9f3cac1fb87d","Type":"ContainerDied","Data":"56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63"} Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.173145 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c9dbf1d-b64e-496f-ad78-593db301c457","Type":"ContainerStarted","Data":"b963cd5d9eb6cfb3893b7ad0908078d07be52568354d744aa0676699524b5320"} Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.177386 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f83b91b-33d1-44fa-9e1c-999b804e51c7","Type":"ContainerStarted","Data":"0d37a711bc22c7327bbcd1cbd3831e32bd7f03a0d63049aaaae7b226cd723229"} Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.187437 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.187737 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c8b784dc-sbvpb" event={"ID":"b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c","Type":"ContainerDied","Data":"0728a86a1fa7c5d39d95f1dbab9883a717e0a37007cb1f3ff3bb7eb3b2616b39"} Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.187805 4732 scope.go:117] "RemoveContainer" containerID="7c4bf2f39bae28b55efdc8966ac8c233f8dedcdefb6c8c7fef65ab3a01f01439" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.259419 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c8b784dc-sbvpb"] Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.262376 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.297030 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79c8b784dc-sbvpb"] Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.311029 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-x6n9t"] Oct 10 07:09:32 crc kubenswrapper[4732]: W1010 07:09:32.392028 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod122832c9_8a6a_48f4_988c_7c4de7dd085a.slice/crio-c8079ee127d6b381615ff90e02d67d9a3d3fd33a3d093819e2b01f4e7fa520ff WatchSource:0}: Error finding container c8079ee127d6b381615ff90e02d67d9a3d3fd33a3d093819e2b01f4e7fa520ff: Status 404 returned error can't find the container with id c8079ee127d6b381615ff90e02d67d9a3d3fd33a3d093819e2b01f4e7fa520ff Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.415309 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-q6pdc"] Oct 10 07:09:32 crc kubenswrapper[4732]: I1010 07:09:32.843436 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-9sdb9"] Oct 10 07:09:32 crc kubenswrapper[4732]: W1010 07:09:32.857078 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4e63d44_6624_462c_9bbd_c6a160083bd0.slice/crio-b418214eaa048dc56f7da039e34d486407f9f4a1859396591210262c617e9e4c WatchSource:0}: Error finding container b418214eaa048dc56f7da039e34d486407f9f4a1859396591210262c617e9e4c: Status 404 returned error can't find the container with id b418214eaa048dc56f7da039e34d486407f9f4a1859396591210262c617e9e4c Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.243031 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9sdb9" 
event={"ID":"d4e63d44-6624-462c-9bbd-c6a160083bd0","Type":"ContainerStarted","Data":"f8c61dd771da3d41b95b01e1da13533f302a1c15793a69ab61292b755af3bd72"} Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.243379 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9sdb9" event={"ID":"d4e63d44-6624-462c-9bbd-c6a160083bd0","Type":"ContainerStarted","Data":"b418214eaa048dc56f7da039e34d486407f9f4a1859396591210262c617e9e4c"} Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.245299 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6pdc" event={"ID":"b89ce220-623c-443f-93f4-4a960ffe29eb","Type":"ContainerStarted","Data":"e2a8782261250cfe89c6ac0e912d56695f00120aa1f12ae40b2b0593166df54c"} Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.257479 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94456597-gz79x" event={"ID":"f5d0e89e-6480-4747-a693-9f3cac1fb87d","Type":"ContainerStarted","Data":"9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb"} Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.258502 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.274464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c9dbf1d-b64e-496f-ad78-593db301c457","Type":"ContainerStarted","Data":"226f05bc70e35cc14c6b6875aafd7f31d4579dbf6b3a1d9b35984434103fe691"} Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.278418 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-9sdb9" podStartSLOduration=2.278397674 podStartE2EDuration="2.278397674s" podCreationTimestamp="2025-10-10 07:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-10 07:09:33.263116518 +0000 UTC m=+1100.332707769" watchObservedRunningTime="2025-10-10 07:09:33.278397674 +0000 UTC m=+1100.347988915" Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.281519 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f83b91b-33d1-44fa-9e1c-999b804e51c7","Type":"ContainerStarted","Data":"5b5a2e8825b7af0d21790e84429d31b1675fc0464a77e0c9ac5e37478d850bdd"} Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.285497 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6n9t" event={"ID":"122832c9-8a6a-48f4-988c-7c4de7dd085a","Type":"ContainerStarted","Data":"c8079ee127d6b381615ff90e02d67d9a3d3fd33a3d093819e2b01f4e7fa520ff"} Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.295259 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d94456597-gz79x" podStartSLOduration=4.295243624 podStartE2EDuration="4.295243624s" podCreationTimestamp="2025-10-10 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:33.282109986 +0000 UTC m=+1100.351701237" watchObservedRunningTime="2025-10-10 07:09:33.295243624 +0000 UTC m=+1100.364834865" Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.680735 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53825ae6-1e7b-469e-9264-f16520e57021" path="/var/lib/kubelet/pods/53825ae6-1e7b-469e-9264-f16520e57021/volumes" Oct 10 07:09:33 crc kubenswrapper[4732]: I1010 07:09:33.681576 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c" path="/var/lib/kubelet/pods/b5ef83fd-31cf-4a1f-95db-ac840b8b4b8c/volumes" Oct 10 07:09:34 crc kubenswrapper[4732]: I1010 07:09:34.299853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"0c9dbf1d-b64e-496f-ad78-593db301c457","Type":"ContainerStarted","Data":"bcfe962b9ecd9b122966c00fd2db8fe765915ae5fcf5b8f244c6d745d61d7533"} Oct 10 07:09:34 crc kubenswrapper[4732]: I1010 07:09:34.300030 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-log" containerID="cri-o://226f05bc70e35cc14c6b6875aafd7f31d4579dbf6b3a1d9b35984434103fe691" gracePeriod=30 Oct 10 07:09:34 crc kubenswrapper[4732]: I1010 07:09:34.300666 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-httpd" containerID="cri-o://bcfe962b9ecd9b122966c00fd2db8fe765915ae5fcf5b8f244c6d745d61d7533" gracePeriod=30 Oct 10 07:09:34 crc kubenswrapper[4732]: I1010 07:09:34.304767 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-log" containerID="cri-o://5b5a2e8825b7af0d21790e84429d31b1675fc0464a77e0c9ac5e37478d850bdd" gracePeriod=30 Oct 10 07:09:34 crc kubenswrapper[4732]: I1010 07:09:34.305056 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f83b91b-33d1-44fa-9e1c-999b804e51c7","Type":"ContainerStarted","Data":"f20041c0e5826c45d46c4530e31036f891700aec63a41b808d36099efc526a79"} Oct 10 07:09:34 crc kubenswrapper[4732]: I1010 07:09:34.305376 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-httpd" containerID="cri-o://f20041c0e5826c45d46c4530e31036f891700aec63a41b808d36099efc526a79" gracePeriod=30 Oct 10 07:09:34 crc kubenswrapper[4732]: 
I1010 07:09:34.327409 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.327388009 podStartE2EDuration="5.327388009s" podCreationTimestamp="2025-10-10 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:34.323112972 +0000 UTC m=+1101.392704213" watchObservedRunningTime="2025-10-10 07:09:34.327388009 +0000 UTC m=+1101.396979250" Oct 10 07:09:34 crc kubenswrapper[4732]: I1010 07:09:34.364956 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.364912322 podStartE2EDuration="5.364912322s" podCreationTimestamp="2025-10-10 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:34.348133604 +0000 UTC m=+1101.417724865" watchObservedRunningTime="2025-10-10 07:09:34.364912322 +0000 UTC m=+1101.434503553" Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.318601 4732 generic.go:334] "Generic (PLEG): container finished" podID="94c15080-3c41-4889-9266-977e7c858d18" containerID="b767fdf343e2f054e157561352aab0e3261a07fc8d390dbd234c4209ebbdc59a" exitCode=0 Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.318683 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jjrv" event={"ID":"94c15080-3c41-4889-9266-977e7c858d18","Type":"ContainerDied","Data":"b767fdf343e2f054e157561352aab0e3261a07fc8d390dbd234c4209ebbdc59a"} Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.321834 4732 generic.go:334] "Generic (PLEG): container finished" podID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerID="bcfe962b9ecd9b122966c00fd2db8fe765915ae5fcf5b8f244c6d745d61d7533" exitCode=0 Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.321873 4732 
generic.go:334] "Generic (PLEG): container finished" podID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerID="226f05bc70e35cc14c6b6875aafd7f31d4579dbf6b3a1d9b35984434103fe691" exitCode=143 Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.321912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c9dbf1d-b64e-496f-ad78-593db301c457","Type":"ContainerDied","Data":"bcfe962b9ecd9b122966c00fd2db8fe765915ae5fcf5b8f244c6d745d61d7533"} Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.322136 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c9dbf1d-b64e-496f-ad78-593db301c457","Type":"ContainerDied","Data":"226f05bc70e35cc14c6b6875aafd7f31d4579dbf6b3a1d9b35984434103fe691"} Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.331360 4732 generic.go:334] "Generic (PLEG): container finished" podID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerID="f20041c0e5826c45d46c4530e31036f891700aec63a41b808d36099efc526a79" exitCode=0 Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.331395 4732 generic.go:334] "Generic (PLEG): container finished" podID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerID="5b5a2e8825b7af0d21790e84429d31b1675fc0464a77e0c9ac5e37478d850bdd" exitCode=143 Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.331441 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f83b91b-33d1-44fa-9e1c-999b804e51c7","Type":"ContainerDied","Data":"f20041c0e5826c45d46c4530e31036f891700aec63a41b808d36099efc526a79"} Oct 10 07:09:35 crc kubenswrapper[4732]: I1010 07:09:35.331485 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f83b91b-33d1-44fa-9e1c-999b804e51c7","Type":"ContainerDied","Data":"5b5a2e8825b7af0d21790e84429d31b1675fc0464a77e0c9ac5e37478d850bdd"} Oct 10 07:09:40 crc kubenswrapper[4732]: 
I1010 07:09:40.146823 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:09:40 crc kubenswrapper[4732]: I1010 07:09:40.208586 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-ktcjd"] Oct 10 07:09:40 crc kubenswrapper[4732]: I1010 07:09:40.209033 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="dnsmasq-dns" containerID="cri-o://aaff43b96fa7dcd4402138faf31aab86679b10bf84ebc281590018b708b6cb13" gracePeriod=10 Oct 10 07:09:40 crc kubenswrapper[4732]: I1010 07:09:40.383218 4732 generic.go:334] "Generic (PLEG): container finished" podID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerID="aaff43b96fa7dcd4402138faf31aab86679b10bf84ebc281590018b708b6cb13" exitCode=0 Oct 10 07:09:40 crc kubenswrapper[4732]: I1010 07:09:40.383283 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" event={"ID":"b8ffdec9-7aa5-4ae8-b860-f3fad859308c","Type":"ContainerDied","Data":"aaff43b96fa7dcd4402138faf31aab86679b10bf84ebc281590018b708b6cb13"} Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.009396 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.082627 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.155469 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-config-data\") pod \"94c15080-3c41-4889-9266-977e7c858d18\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.156567 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-fernet-keys\") pod \"94c15080-3c41-4889-9266-977e7c858d18\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.157422 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn8sm\" (UniqueName: \"kubernetes.io/projected/94c15080-3c41-4889-9266-977e7c858d18-kube-api-access-hn8sm\") pod \"94c15080-3c41-4889-9266-977e7c858d18\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.157540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-scripts\") pod \"94c15080-3c41-4889-9266-977e7c858d18\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.157847 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-credential-keys\") pod \"94c15080-3c41-4889-9266-977e7c858d18\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.157975 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-combined-ca-bundle\") pod \"94c15080-3c41-4889-9266-977e7c858d18\" (UID: \"94c15080-3c41-4889-9266-977e7c858d18\") " Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.161582 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c15080-3c41-4889-9266-977e7c858d18-kube-api-access-hn8sm" (OuterVolumeSpecName: "kube-api-access-hn8sm") pod "94c15080-3c41-4889-9266-977e7c858d18" (UID: "94c15080-3c41-4889-9266-977e7c858d18"). InnerVolumeSpecName "kube-api-access-hn8sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.162398 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-scripts" (OuterVolumeSpecName: "scripts") pod "94c15080-3c41-4889-9266-977e7c858d18" (UID: "94c15080-3c41-4889-9266-977e7c858d18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.164750 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "94c15080-3c41-4889-9266-977e7c858d18" (UID: "94c15080-3c41-4889-9266-977e7c858d18"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.173052 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "94c15080-3c41-4889-9266-977e7c858d18" (UID: "94c15080-3c41-4889-9266-977e7c858d18"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.186539 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94c15080-3c41-4889-9266-977e7c858d18" (UID: "94c15080-3c41-4889-9266-977e7c858d18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.188172 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-config-data" (OuterVolumeSpecName: "config-data") pod "94c15080-3c41-4889-9266-977e7c858d18" (UID: "94c15080-3c41-4889-9266-977e7c858d18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.260717 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.260777 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.260789 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.260798 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.260810 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn8sm\" (UniqueName: \"kubernetes.io/projected/94c15080-3c41-4889-9266-977e7c858d18-kube-api-access-hn8sm\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.260823 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94c15080-3c41-4889-9266-977e7c858d18-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.396252 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6jjrv" event={"ID":"94c15080-3c41-4889-9266-977e7c858d18","Type":"ContainerDied","Data":"524b201636b8e44d887ec042cda3e0ecaa6d05c33fa060f001a3ae1076880b9d"} Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 
07:09:41.396318 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524b201636b8e44d887ec042cda3e0ecaa6d05c33fa060f001a3ae1076880b9d" Oct 10 07:09:41 crc kubenswrapper[4732]: I1010 07:09:41.396356 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6jjrv" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.156503 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.199852 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6jjrv"] Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.211618 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6jjrv"] Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.277947 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-internal-tls-certs\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.278018 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-combined-ca-bundle\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.278078 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgnqm\" (UniqueName: \"kubernetes.io/projected/5f83b91b-33d1-44fa-9e1c-999b804e51c7-kube-api-access-qgnqm\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: 
I1010 07:09:42.278109 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-logs\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.278154 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-httpd-run\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.278244 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-scripts\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.278726 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.278829 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-config-data\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.281705 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-logs" (OuterVolumeSpecName: "logs") pod "5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.282593 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\" (UID: \"5f83b91b-33d1-44fa-9e1c-999b804e51c7\") " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.283340 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.283594 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f83b91b-33d1-44fa-9e1c-999b804e51c7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.300277 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-scripts" (OuterVolumeSpecName: "scripts") pod "5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.300354 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.301329 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f83b91b-33d1-44fa-9e1c-999b804e51c7-kube-api-access-qgnqm" (OuterVolumeSpecName: "kube-api-access-qgnqm") pod "5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "kube-api-access-qgnqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.307885 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mz64q"] Oct 10 07:09:42 crc kubenswrapper[4732]: E1010 07:09:42.308229 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-httpd" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.308244 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-httpd" Oct 10 07:09:42 crc kubenswrapper[4732]: E1010 07:09:42.308260 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c15080-3c41-4889-9266-977e7c858d18" containerName="keystone-bootstrap" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.308267 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c15080-3c41-4889-9266-977e7c858d18" containerName="keystone-bootstrap" Oct 10 07:09:42 crc kubenswrapper[4732]: E1010 07:09:42.308294 4732 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-log" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.308300 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-log" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.308446 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c15080-3c41-4889-9266-977e7c858d18" containerName="keystone-bootstrap" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.308460 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-httpd" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.308473 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" containerName="glance-log" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.308992 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.310584 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.312527 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.313064 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8gnjq" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.315822 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.316098 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.335846 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mz64q"] Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.368706 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-config-data" (OuterVolumeSpecName: "config-data") pod "5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386020 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-config-data\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386095 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-fernet-keys\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386379 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-combined-ca-bundle\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: 
I1010 07:09:42.386468 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df67g\" (UniqueName: \"kubernetes.io/projected/55cc47b2-b0ca-4234-b5da-2779e1210367-kube-api-access-df67g\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386515 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-credential-keys\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-scripts\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386728 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386767 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgnqm\" (UniqueName: \"kubernetes.io/projected/5f83b91b-33d1-44fa-9e1c-999b804e51c7-kube-api-access-qgnqm\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386780 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc 
kubenswrapper[4732]: I1010 07:09:42.386794 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.386841 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.409669 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f83b91b-33d1-44fa-9e1c-999b804e51c7","Type":"ContainerDied","Data":"0d37a711bc22c7327bbcd1cbd3831e32bd7f03a0d63049aaaae7b226cd723229"} Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.409746 4732 scope.go:117] "RemoveContainer" containerID="f20041c0e5826c45d46c4530e31036f891700aec63a41b808d36099efc526a79" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.409855 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.413199 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.426880 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5f83b91b-33d1-44fa-9e1c-999b804e51c7" (UID: "5f83b91b-33d1-44fa-9e1c-999b804e51c7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.489048 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-combined-ca-bundle\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.489118 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df67g\" (UniqueName: \"kubernetes.io/projected/55cc47b2-b0ca-4234-b5da-2779e1210367-kube-api-access-df67g\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.489166 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-credential-keys\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.489196 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-scripts\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.489297 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-config-data\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 
07:09:42.489348 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-fernet-keys\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.489403 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f83b91b-33d1-44fa-9e1c-999b804e51c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.489419 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.492460 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-scripts\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.492536 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-combined-ca-bundle\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.493161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-fernet-keys\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 
07:09:42.494090 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-credential-keys\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.494165 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-config-data\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.507424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df67g\" (UniqueName: \"kubernetes.io/projected/55cc47b2-b0ca-4234-b5da-2779e1210367-kube-api-access-df67g\") pod \"keystone-bootstrap-mz64q\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.732575 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.744291 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.755437 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.761177 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.762592 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.765386 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.765664 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.781506 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.894653 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.894763 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.894816 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.894862 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.894893 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.895114 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.895185 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-logs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.895223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4cqs\" (UniqueName: \"kubernetes.io/projected/e300892d-faa1-4ba1-8095-04539bc33e27-kube-api-access-v4cqs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997346 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997447 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997481 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-logs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997499 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4cqs\" (UniqueName: \"kubernetes.io/projected/e300892d-faa1-4ba1-8095-04539bc33e27-kube-api-access-v4cqs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997524 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997609 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.997645 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.998363 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.998815 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-logs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:42 crc kubenswrapper[4732]: I1010 07:09:42.999072 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.002197 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.003661 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.006410 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.009399 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.023593 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4cqs\" (UniqueName: \"kubernetes.io/projected/e300892d-faa1-4ba1-8095-04539bc33e27-kube-api-access-v4cqs\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.035878 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.089491 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.671831 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f83b91b-33d1-44fa-9e1c-999b804e51c7" path="/var/lib/kubelet/pods/5f83b91b-33d1-44fa-9e1c-999b804e51c7/volumes" Oct 10 07:09:43 crc kubenswrapper[4732]: I1010 07:09:43.672839 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c15080-3c41-4889-9266-977e7c858d18" path="/var/lib/kubelet/pods/94c15080-3c41-4889-9266-977e7c858d18/volumes" Oct 10 07:09:51 crc kubenswrapper[4732]: I1010 07:09:51.083548 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: i/o timeout" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.277779 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.287351 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385615 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-nb\") pod \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385676 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-scripts\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385721 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-logs\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385767 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-dns-svc\") pod \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385812 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-config\") pod \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385863 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-config-data\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385904 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-combined-ca-bundle\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385958 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-public-tls-certs\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.385980 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-httpd-run\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.386039 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-sb\") pod \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.386064 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nnhkl\" (UniqueName: \"kubernetes.io/projected/0c9dbf1d-b64e-496f-ad78-593db301c457-kube-api-access-nnhkl\") pod \"0c9dbf1d-b64e-496f-ad78-593db301c457\" (UID: \"0c9dbf1d-b64e-496f-ad78-593db301c457\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.386101 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ppdz\" (UniqueName: \"kubernetes.io/projected/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-kube-api-access-5ppdz\") pod \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\" (UID: \"b8ffdec9-7aa5-4ae8-b860-f3fad859308c\") " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.386535 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.387092 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-logs" (OuterVolumeSpecName: "logs") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.391149 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c9dbf1d-b64e-496f-ad78-593db301c457-kube-api-access-nnhkl" (OuterVolumeSpecName: "kube-api-access-nnhkl") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "kube-api-access-nnhkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.391785 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-kube-api-access-5ppdz" (OuterVolumeSpecName: "kube-api-access-5ppdz") pod "b8ffdec9-7aa5-4ae8-b860-f3fad859308c" (UID: "b8ffdec9-7aa5-4ae8-b860-f3fad859308c"). InnerVolumeSpecName "kube-api-access-5ppdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.396121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-scripts" (OuterVolumeSpecName: "scripts") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.405539 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.411341 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.430619 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8ffdec9-7aa5-4ae8-b860-f3fad859308c" (UID: "b8ffdec9-7aa5-4ae8-b860-f3fad859308c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.433381 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8ffdec9-7aa5-4ae8-b860-f3fad859308c" (UID: "b8ffdec9-7aa5-4ae8-b860-f3fad859308c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.433953 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.437561 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b8ffdec9-7aa5-4ae8-b860-f3fad859308c" (UID: "b8ffdec9-7aa5-4ae8-b860-f3fad859308c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.440101 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-config-data" (OuterVolumeSpecName: "config-data") pod "0c9dbf1d-b64e-496f-ad78-593db301c457" (UID: "0c9dbf1d-b64e-496f-ad78-593db301c457"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.452306 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-config" (OuterVolumeSpecName: "config") pod "b8ffdec9-7aa5-4ae8-b860-f3fad859308c" (UID: "b8ffdec9-7aa5-4ae8-b860-f3fad859308c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488296 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488327 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488339 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488367 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 
07:09:53.488377 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488386 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488394 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488403 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnhkl\" (UniqueName: \"kubernetes.io/projected/0c9dbf1d-b64e-496f-ad78-593db301c457-kube-api-access-nnhkl\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488412 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ppdz\" (UniqueName: \"kubernetes.io/projected/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-kube-api-access-5ppdz\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488420 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488428 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c9dbf1d-b64e-496f-ad78-593db301c457-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488437 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0c9dbf1d-b64e-496f-ad78-593db301c457-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.488445 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8ffdec9-7aa5-4ae8-b860-f3fad859308c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.501943 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.501957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0c9dbf1d-b64e-496f-ad78-593db301c457","Type":"ContainerDied","Data":"b963cd5d9eb6cfb3893b7ad0908078d07be52568354d744aa0676699524b5320"} Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.505650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" event={"ID":"b8ffdec9-7aa5-4ae8-b860-f3fad859308c","Type":"ContainerDied","Data":"3729c87082f24e5168376c3a6a520f06edacfcd72beee8adee185d338d12e5dd"} Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.505751 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.507505 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.561494 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.561538 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.586054 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-ktcjd"] Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.591176 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.598796 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:53 crc kubenswrapper[4732]: E1010 07:09:53.599182 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-httpd" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.599194 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-httpd" Oct 10 07:09:53 crc kubenswrapper[4732]: E1010 07:09:53.599203 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="dnsmasq-dns" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.599209 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="dnsmasq-dns" Oct 10 07:09:53 crc 
kubenswrapper[4732]: E1010 07:09:53.599240 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-log" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.599248 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-log" Oct 10 07:09:53 crc kubenswrapper[4732]: E1010 07:09:53.599277 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="init" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.599286 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="init" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.599528 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="dnsmasq-dns" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.599545 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-httpd" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.599568 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" containerName="glance-log" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.600581 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.602606 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.602857 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.607464 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57f58c7cff-ktcjd"] Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.614000 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.675813 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c9dbf1d-b64e-496f-ad78-593db301c457" path="/var/lib/kubelet/pods/0c9dbf1d-b64e-496f-ad78-593db301c457/volumes" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.676547 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" path="/var/lib/kubelet/pods/b8ffdec9-7aa5-4ae8-b860-f3fad859308c/volumes" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.692951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-config-data\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.693011 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjlfr\" (UniqueName: \"kubernetes.io/projected/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-kube-api-access-gjlfr\") pod \"glance-default-external-api-0\" (UID: 
\"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.693177 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.693220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.693307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-logs\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.693366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-scripts\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.693433 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.693466 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.797837 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.797940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798070 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-config-data\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjlfr\" (UniqueName: \"kubernetes.io/projected/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-kube-api-access-gjlfr\") pod \"glance-default-external-api-0\" (UID: 
\"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798451 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798489 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798673 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-logs\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798765 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-scripts\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798848 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" 
Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.798703 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.799470 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-logs\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.801934 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.802051 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.802095 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-scripts\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.802886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-config-data\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.816935 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjlfr\" (UniqueName: \"kubernetes.io/projected/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-kube-api-access-gjlfr\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.826764 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " pod="openstack/glance-default-external-api-0" Oct 10 07:09:53 crc kubenswrapper[4732]: I1010 07:09:53.919872 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:09:54 crc kubenswrapper[4732]: I1010 07:09:54.409788 4732 scope.go:117] "RemoveContainer" containerID="5b5a2e8825b7af0d21790e84429d31b1675fc0464a77e0c9ac5e37478d850bdd" Oct 10 07:09:54 crc kubenswrapper[4732]: E1010 07:09:54.422624 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f" Oct 10 07:09:54 crc kubenswrapper[4732]: E1010 07:09:54.422975 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Volum
eMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqq45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-q6pdc_openstack(b89ce220-623c-443f-93f4-4a960ffe29eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 07:09:54 crc kubenswrapper[4732]: E1010 07:09:54.424213 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-q6pdc" podUID="b89ce220-623c-443f-93f4-4a960ffe29eb" Oct 10 07:09:54 crc kubenswrapper[4732]: E1010 07:09:54.551673 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:85c75d60e1bd2f8a9ea0a2bb21a8df64c0a6f7b504cc1a05a355981d4b90e92f\\\"\"" pod="openstack/cinder-db-sync-q6pdc" podUID="b89ce220-623c-443f-93f4-4a960ffe29eb" Oct 10 07:09:54 crc kubenswrapper[4732]: I1010 07:09:54.572797 4732 scope.go:117] "RemoveContainer" containerID="bcfe962b9ecd9b122966c00fd2db8fe765915ae5fcf5b8f244c6d745d61d7533" Oct 10 07:09:54 crc kubenswrapper[4732]: I1010 07:09:54.645531 4732 scope.go:117] "RemoveContainer" containerID="226f05bc70e35cc14c6b6875aafd7f31d4579dbf6b3a1d9b35984434103fe691" Oct 10 07:09:54 crc kubenswrapper[4732]: I1010 07:09:54.727037 4732 scope.go:117] "RemoveContainer" containerID="aaff43b96fa7dcd4402138faf31aab86679b10bf84ebc281590018b708b6cb13" Oct 10 07:09:54 crc kubenswrapper[4732]: I1010 07:09:54.790769 4732 scope.go:117] "RemoveContainer" containerID="4f39667823d73599e823341b3ce2ee66d4b6f95631b4a7222a0056f73811dbf8" Oct 10 07:09:54 crc kubenswrapper[4732]: I1010 07:09:54.926605 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mz64q"] Oct 10 07:09:54 crc kubenswrapper[4732]: W1010 07:09:54.936062 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55cc47b2_b0ca_4234_b5da_2779e1210367.slice/crio-d3d85f3ced41b00412b3c694f2290c232399350d70079ecc024db4922bac0164 WatchSource:0}: Error finding container d3d85f3ced41b00412b3c694f2290c232399350d70079ecc024db4922bac0164: Status 404 returned error can't find the container with id d3d85f3ced41b00412b3c694f2290c232399350d70079ecc024db4922bac0164 Oct 10 07:09:55 crc kubenswrapper[4732]: W1010 07:09:55.348485 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15fceeba_43c8_485c_a3f9_a4a1d5c74ff9.slice/crio-0423956a5b53d32f84a4c44e34d606b9f504f3b37c897a6b1e3cb6f852f7ad59 WatchSource:0}: Error finding container 
0423956a5b53d32f84a4c44e34d606b9f504f3b37c897a6b1e3cb6f852f7ad59: Status 404 returned error can't find the container with id 0423956a5b53d32f84a4c44e34d606b9f504f3b37c897a6b1e3cb6f852f7ad59 Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.355407 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.355469 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.355519 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.356337 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"239755fce5a5e3e2f7099222145032c91288a72b835ff1a7f25dd0d9f8c8d6b0"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.356414 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://239755fce5a5e3e2f7099222145032c91288a72b835ff1a7f25dd0d9f8c8d6b0" gracePeriod=600 Oct 10 07:09:55 crc 
kubenswrapper[4732]: I1010 07:09:55.363493 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.550320 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9","Type":"ContainerStarted","Data":"0423956a5b53d32f84a4c44e34d606b9f504f3b37c897a6b1e3cb6f852f7ad59"} Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.553477 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mz64q" event={"ID":"55cc47b2-b0ca-4234-b5da-2779e1210367","Type":"ContainerStarted","Data":"d1b9c9ec1328c0bb6e1e23ba7b0563f2efb58e6dd7800267dc005eab16e1c685"} Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.553818 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mz64q" event={"ID":"55cc47b2-b0ca-4234-b5da-2779e1210367","Type":"ContainerStarted","Data":"d3d85f3ced41b00412b3c694f2290c232399350d70079ecc024db4922bac0164"} Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.557511 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerStarted","Data":"2179c865f9814250bd4f2d617e85dbe0f1e81cb9fda3265c86a32d30a2a34b79"} Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.560979 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6n9t" event={"ID":"122832c9-8a6a-48f4-988c-7c4de7dd085a","Type":"ContainerStarted","Data":"a75822e5a435154e34f2342407e93653c250150aa62c0761fe0af6d275499cd7"} Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.562492 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cx6h2" 
event={"ID":"e60af77c-522d-441f-9174-a0242edc0361","Type":"ContainerStarted","Data":"62b7b16cc6a1f4bc0fab4fef610ce804f0be794fe762edd6193d6881fc145b47"} Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.573725 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mz64q" podStartSLOduration=13.573682437 podStartE2EDuration="13.573682437s" podCreationTimestamp="2025-10-10 07:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:55.567284123 +0000 UTC m=+1122.636875364" watchObservedRunningTime="2025-10-10 07:09:55.573682437 +0000 UTC m=+1122.643273688" Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.576570 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="239755fce5a5e3e2f7099222145032c91288a72b835ff1a7f25dd0d9f8c8d6b0" exitCode=0 Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.576618 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"239755fce5a5e3e2f7099222145032c91288a72b835ff1a7f25dd0d9f8c8d6b0"} Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.576651 4732 scope.go:117] "RemoveContainer" containerID="91a09d30e877ca33916c77e5509ae9d7f46220996d3447c91784df288ae8e1b0" Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.596749 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cx6h2" podStartSLOduration=4.094129457 podStartE2EDuration="26.596679054s" podCreationTimestamp="2025-10-10 07:09:29 +0000 UTC" firstStartedPulling="2025-10-10 07:09:30.651283109 +0000 UTC m=+1097.720874350" lastFinishedPulling="2025-10-10 07:09:53.153832706 +0000 UTC m=+1120.223423947" observedRunningTime="2025-10-10 07:09:55.593767745 
+0000 UTC m=+1122.663359006" watchObservedRunningTime="2025-10-10 07:09:55.596679054 +0000 UTC m=+1122.666270295" Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.630424 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-x6n9t" podStartSLOduration=2.6227658529999998 podStartE2EDuration="24.630129486s" podCreationTimestamp="2025-10-10 07:09:31 +0000 UTC" firstStartedPulling="2025-10-10 07:09:32.430669369 +0000 UTC m=+1099.500260610" lastFinishedPulling="2025-10-10 07:09:54.438033002 +0000 UTC m=+1121.507624243" observedRunningTime="2025-10-10 07:09:55.607837608 +0000 UTC m=+1122.677428869" watchObservedRunningTime="2025-10-10 07:09:55.630129486 +0000 UTC m=+1122.699720727" Oct 10 07:09:55 crc kubenswrapper[4732]: I1010 07:09:55.944245 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:09:56 crc kubenswrapper[4732]: I1010 07:09:56.084464 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57f58c7cff-ktcjd" podUID="b8ffdec9-7aa5-4ae8-b860-f3fad859308c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: i/o timeout" Oct 10 07:09:56 crc kubenswrapper[4732]: I1010 07:09:56.614323 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e300892d-faa1-4ba1-8095-04539bc33e27","Type":"ContainerStarted","Data":"3369f97e9beba140b2e3d0aec0a446b524437c050aff712b8f90b3872dfc7076"} Oct 10 07:09:56 crc kubenswrapper[4732]: I1010 07:09:56.623442 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"58650635f1fbb6c2c9e22b572de1a4b4db4e63148a8b451b4d739ea558750b87"} Oct 10 07:09:56 crc kubenswrapper[4732]: I1010 07:09:56.629073 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9","Type":"ContainerStarted","Data":"fa2fe5ffad08eda01609d40a03daf2d04b8e09e8fdd5eedf039c44ae70aee915"} Oct 10 07:09:57 crc kubenswrapper[4732]: I1010 07:09:57.637683 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerStarted","Data":"78ed76c2c1f4af26198c1385456af18bf92248dd6b0384fea2893b66333c5509"} Oct 10 07:09:57 crc kubenswrapper[4732]: I1010 07:09:57.639530 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e300892d-faa1-4ba1-8095-04539bc33e27","Type":"ContainerStarted","Data":"b7464fe56625fd8bdabbebfebe13747ad27ebcc9839bed3c455e68309f1b3a7d"} Oct 10 07:09:57 crc kubenswrapper[4732]: I1010 07:09:57.643051 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9","Type":"ContainerStarted","Data":"3ecbe09d14b061b333b0a3c30f7012202070c01d6d13e5aeab8d8547767af628"} Oct 10 07:09:58 crc kubenswrapper[4732]: I1010 07:09:58.653985 4732 generic.go:334] "Generic (PLEG): container finished" podID="55cc47b2-b0ca-4234-b5da-2779e1210367" containerID="d1b9c9ec1328c0bb6e1e23ba7b0563f2efb58e6dd7800267dc005eab16e1c685" exitCode=0 Oct 10 07:09:58 crc kubenswrapper[4732]: I1010 07:09:58.654255 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mz64q" event={"ID":"55cc47b2-b0ca-4234-b5da-2779e1210367","Type":"ContainerDied","Data":"d1b9c9ec1328c0bb6e1e23ba7b0563f2efb58e6dd7800267dc005eab16e1c685"} Oct 10 07:09:58 crc kubenswrapper[4732]: I1010 07:09:58.663605 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e300892d-faa1-4ba1-8095-04539bc33e27","Type":"ContainerStarted","Data":"f33cbdea2122f73beec7de26dbafdccd714f4862f702da77f5ed82dd6b03e431"} Oct 10 07:09:58 crc kubenswrapper[4732]: I1010 07:09:58.679973 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.67994892 podStartE2EDuration="5.67994892s" podCreationTimestamp="2025-10-10 07:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:57.662203688 +0000 UTC m=+1124.731794939" watchObservedRunningTime="2025-10-10 07:09:58.67994892 +0000 UTC m=+1125.749540161" Oct 10 07:09:58 crc kubenswrapper[4732]: I1010 07:09:58.706090 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.706069192 podStartE2EDuration="16.706069192s" podCreationTimestamp="2025-10-10 07:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:09:58.693615632 +0000 UTC m=+1125.763206873" watchObservedRunningTime="2025-10-10 07:09:58.706069192 +0000 UTC m=+1125.775660433" Oct 10 07:09:59 crc kubenswrapper[4732]: I1010 07:09:59.672565 4732 generic.go:334] "Generic (PLEG): container finished" podID="e60af77c-522d-441f-9174-a0242edc0361" containerID="62b7b16cc6a1f4bc0fab4fef610ce804f0be794fe762edd6193d6881fc145b47" exitCode=0 Oct 10 07:09:59 crc kubenswrapper[4732]: I1010 07:09:59.672800 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cx6h2" event={"ID":"e60af77c-522d-441f-9174-a0242edc0361","Type":"ContainerDied","Data":"62b7b16cc6a1f4bc0fab4fef610ce804f0be794fe762edd6193d6881fc145b47"} Oct 10 07:10:00 crc kubenswrapper[4732]: I1010 07:10:00.685667 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="122832c9-8a6a-48f4-988c-7c4de7dd085a" containerID="a75822e5a435154e34f2342407e93653c250150aa62c0761fe0af6d275499cd7" exitCode=0 Oct 10 07:10:00 crc kubenswrapper[4732]: I1010 07:10:00.685781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6n9t" event={"ID":"122832c9-8a6a-48f4-988c-7c4de7dd085a","Type":"ContainerDied","Data":"a75822e5a435154e34f2342407e93653c250150aa62c0761fe0af6d275499cd7"} Oct 10 07:10:01 crc kubenswrapper[4732]: I1010 07:10:01.974091 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:10:01 crc kubenswrapper[4732]: I1010 07:10:01.985775 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cx6h2" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056040 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-scripts\") pod \"e60af77c-522d-441f-9174-a0242edc0361\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056088 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-scripts\") pod \"55cc47b2-b0ca-4234-b5da-2779e1210367\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056116 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-combined-ca-bundle\") pod \"e60af77c-522d-441f-9174-a0242edc0361\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056153 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-credential-keys\") pod \"55cc47b2-b0ca-4234-b5da-2779e1210367\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056179 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-combined-ca-bundle\") pod \"55cc47b2-b0ca-4234-b5da-2779e1210367\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056225 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-config-data\") pod \"55cc47b2-b0ca-4234-b5da-2779e1210367\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056277 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-fernet-keys\") pod \"55cc47b2-b0ca-4234-b5da-2779e1210367\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056292 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df67g\" (UniqueName: \"kubernetes.io/projected/55cc47b2-b0ca-4234-b5da-2779e1210367-kube-api-access-df67g\") pod \"55cc47b2-b0ca-4234-b5da-2779e1210367\" (UID: \"55cc47b2-b0ca-4234-b5da-2779e1210367\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056313 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km56t\" (UniqueName: \"kubernetes.io/projected/e60af77c-522d-441f-9174-a0242edc0361-kube-api-access-km56t\") pod \"e60af77c-522d-441f-9174-a0242edc0361\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " Oct 10 
07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056339 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-config-data\") pod \"e60af77c-522d-441f-9174-a0242edc0361\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056372 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e60af77c-522d-441f-9174-a0242edc0361-logs\") pod \"e60af77c-522d-441f-9174-a0242edc0361\" (UID: \"e60af77c-522d-441f-9174-a0242edc0361\") " Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.056966 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e60af77c-522d-441f-9174-a0242edc0361-logs" (OuterVolumeSpecName: "logs") pod "e60af77c-522d-441f-9174-a0242edc0361" (UID: "e60af77c-522d-441f-9174-a0242edc0361"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.062826 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-scripts" (OuterVolumeSpecName: "scripts") pod "e60af77c-522d-441f-9174-a0242edc0361" (UID: "e60af77c-522d-441f-9174-a0242edc0361"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.063228 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60af77c-522d-441f-9174-a0242edc0361-kube-api-access-km56t" (OuterVolumeSpecName: "kube-api-access-km56t") pod "e60af77c-522d-441f-9174-a0242edc0361" (UID: "e60af77c-522d-441f-9174-a0242edc0361"). InnerVolumeSpecName "kube-api-access-km56t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.063715 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-scripts" (OuterVolumeSpecName: "scripts") pod "55cc47b2-b0ca-4234-b5da-2779e1210367" (UID: "55cc47b2-b0ca-4234-b5da-2779e1210367"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.064486 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "55cc47b2-b0ca-4234-b5da-2779e1210367" (UID: "55cc47b2-b0ca-4234-b5da-2779e1210367"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.064825 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cc47b2-b0ca-4234-b5da-2779e1210367-kube-api-access-df67g" (OuterVolumeSpecName: "kube-api-access-df67g") pod "55cc47b2-b0ca-4234-b5da-2779e1210367" (UID: "55cc47b2-b0ca-4234-b5da-2779e1210367"). InnerVolumeSpecName "kube-api-access-df67g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.064878 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "55cc47b2-b0ca-4234-b5da-2779e1210367" (UID: "55cc47b2-b0ca-4234-b5da-2779e1210367"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.085558 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-config-data" (OuterVolumeSpecName: "config-data") pod "55cc47b2-b0ca-4234-b5da-2779e1210367" (UID: "55cc47b2-b0ca-4234-b5da-2779e1210367"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.090102 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55cc47b2-b0ca-4234-b5da-2779e1210367" (UID: "55cc47b2-b0ca-4234-b5da-2779e1210367"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.091020 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-config-data" (OuterVolumeSpecName: "config-data") pod "e60af77c-522d-441f-9174-a0242edc0361" (UID: "e60af77c-522d-441f-9174-a0242edc0361"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.091962 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e60af77c-522d-441f-9174-a0242edc0361" (UID: "e60af77c-522d-441f-9174-a0242edc0361"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167036 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167079 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167092 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df67g\" (UniqueName: \"kubernetes.io/projected/55cc47b2-b0ca-4234-b5da-2779e1210367-kube-api-access-df67g\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167106 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km56t\" (UniqueName: \"kubernetes.io/projected/e60af77c-522d-441f-9174-a0242edc0361-kube-api-access-km56t\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167117 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167129 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e60af77c-522d-441f-9174-a0242edc0361-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167139 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167149 4732 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167159 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e60af77c-522d-441f-9174-a0242edc0361-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167169 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.167179 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cc47b2-b0ca-4234-b5da-2779e1210367-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.714119 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cx6h2" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.714137 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cx6h2" event={"ID":"e60af77c-522d-441f-9174-a0242edc0361","Type":"ContainerDied","Data":"adde8706f52beb2b80ad945572a617167cc6c00e1977c5cdf85386673e7b0049"} Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.714199 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adde8706f52beb2b80ad945572a617167cc6c00e1977c5cdf85386673e7b0049" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.717137 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mz64q" event={"ID":"55cc47b2-b0ca-4234-b5da-2779e1210367","Type":"ContainerDied","Data":"d3d85f3ced41b00412b3c694f2290c232399350d70079ecc024db4922bac0164"} Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.717190 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d85f3ced41b00412b3c694f2290c232399350d70079ecc024db4922bac0164" Oct 10 07:10:02 crc kubenswrapper[4732]: I1010 07:10:02.717260 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mz64q" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.086671 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8c7d5b696-rkhkz"] Oct 10 07:10:03 crc kubenswrapper[4732]: E1010 07:10:03.087140 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cc47b2-b0ca-4234-b5da-2779e1210367" containerName="keystone-bootstrap" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.087159 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cc47b2-b0ca-4234-b5da-2779e1210367" containerName="keystone-bootstrap" Oct 10 07:10:03 crc kubenswrapper[4732]: E1010 07:10:03.087195 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60af77c-522d-441f-9174-a0242edc0361" containerName="placement-db-sync" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.087204 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60af77c-522d-441f-9174-a0242edc0361" containerName="placement-db-sync" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.087438 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cc47b2-b0ca-4234-b5da-2779e1210367" containerName="keystone-bootstrap" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.087455 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e60af77c-522d-441f-9174-a0242edc0361" containerName="placement-db-sync" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.088214 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.090137 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.090317 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.090353 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.094555 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.094671 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.094877 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-8gnjq" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.094917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.095039 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.106208 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8c7d5b696-rkhkz"] Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.141947 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.170005 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: 
I1010 07:10:03.182383 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-public-tls-certs\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.182460 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-config-data\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.182508 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-internal-tls-certs\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.182533 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-fernet-keys\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.182557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-credential-keys\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 
07:10:03.182592 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-scripts\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.182633 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.182770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pglx\" (UniqueName: \"kubernetes.io/projected/b93e689a-691a-403b-970f-63547469bbfe-kube-api-access-4pglx\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.198210 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5d445dfc98-wk5w4"] Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.200050 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.210446 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.210784 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.210934 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x8v5s" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.211518 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.214509 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.218933 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d445dfc98-wk5w4"] Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285441 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-config-data\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285500 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pglx\" (UniqueName: \"kubernetes.io/projected/b93e689a-691a-403b-970f-63547469bbfe-kube-api-access-4pglx\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285533 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-scripts\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285568 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e37998e-491a-43b8-abda-4bdfea233217-logs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285591 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-public-tls-certs\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285710 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-config-data\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285763 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-public-tls-certs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-internal-tls-certs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285866 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-internal-tls-certs\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285903 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-fernet-keys\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-credential-keys\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.285990 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-scripts\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.286030 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-combined-ca-bundle\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.286056 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78lrr\" (UniqueName: \"kubernetes.io/projected/3e37998e-491a-43b8-abda-4bdfea233217-kube-api-access-78lrr\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.286133 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.289462 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-public-tls-certs\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.289747 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-internal-tls-certs\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.290112 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-scripts\") 
pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.290468 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-config-data\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.290041 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-credential-keys\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.296354 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-fernet-keys\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.303345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.304781 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pglx\" (UniqueName: \"kubernetes.io/projected/b93e689a-691a-403b-970f-63547469bbfe-kube-api-access-4pglx\") pod \"keystone-8c7d5b696-rkhkz\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 
07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.387895 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-config-data\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.387959 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-scripts\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.388006 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e37998e-491a-43b8-abda-4bdfea233217-logs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.388077 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-public-tls-certs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.388112 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-internal-tls-certs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.388149 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-combined-ca-bundle\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.388169 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78lrr\" (UniqueName: \"kubernetes.io/projected/3e37998e-491a-43b8-abda-4bdfea233217-kube-api-access-78lrr\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.388912 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e37998e-491a-43b8-abda-4bdfea233217-logs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.391904 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-public-tls-certs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.392655 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-internal-tls-certs\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.393026 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-scripts\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.393277 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-config-data\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.393340 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-combined-ca-bundle\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.408551 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78lrr\" (UniqueName: \"kubernetes.io/projected/3e37998e-491a-43b8-abda-4bdfea233217-kube-api-access-78lrr\") pod \"placement-5d445dfc98-wk5w4\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.417632 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.529098 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.697721 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.739766 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-x6n9t" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.739908 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-x6n9t" event={"ID":"122832c9-8a6a-48f4-988c-7c4de7dd085a","Type":"ContainerDied","Data":"c8079ee127d6b381615ff90e02d67d9a3d3fd33a3d093819e2b01f4e7fa520ff"} Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.739929 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8079ee127d6b381615ff90e02d67d9a3d3fd33a3d093819e2b01f4e7fa520ff" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.739946 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.740121 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.795762 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlgfc\" (UniqueName: \"kubernetes.io/projected/122832c9-8a6a-48f4-988c-7c4de7dd085a-kube-api-access-jlgfc\") pod \"122832c9-8a6a-48f4-988c-7c4de7dd085a\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.797084 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-db-sync-config-data\") pod \"122832c9-8a6a-48f4-988c-7c4de7dd085a\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.797235 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-combined-ca-bundle\") pod \"122832c9-8a6a-48f4-988c-7c4de7dd085a\" (UID: \"122832c9-8a6a-48f4-988c-7c4de7dd085a\") " Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.805718 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "122832c9-8a6a-48f4-988c-7c4de7dd085a" (UID: "122832c9-8a6a-48f4-988c-7c4de7dd085a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.805790 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122832c9-8a6a-48f4-988c-7c4de7dd085a-kube-api-access-jlgfc" (OuterVolumeSpecName: "kube-api-access-jlgfc") pod "122832c9-8a6a-48f4-988c-7c4de7dd085a" (UID: "122832c9-8a6a-48f4-988c-7c4de7dd085a"). InnerVolumeSpecName "kube-api-access-jlgfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.854521 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "122832c9-8a6a-48f4-988c-7c4de7dd085a" (UID: "122832c9-8a6a-48f4-988c-7c4de7dd085a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.903267 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlgfc\" (UniqueName: \"kubernetes.io/projected/122832c9-8a6a-48f4-988c-7c4de7dd085a-kube-api-access-jlgfc\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.903297 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.903309 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122832c9-8a6a-48f4-988c-7c4de7dd085a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.920141 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.921139 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.962879 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 07:10:03 crc kubenswrapper[4732]: I1010 07:10:03.979091 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.200025 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8c7d5b696-rkhkz"] Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.309878 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d445dfc98-wk5w4"] Oct 10 07:10:04 crc kubenswrapper[4732]: W1010 
07:10:04.314323 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e37998e_491a_43b8_abda_4bdfea233217.slice/crio-645c9cd4a1dd0b79ccd8c0bbbd6ad0325c5870e48176f2d49707cdd64e3a78e3 WatchSource:0}: Error finding container 645c9cd4a1dd0b79ccd8c0bbbd6ad0325c5870e48176f2d49707cdd64e3a78e3: Status 404 returned error can't find the container with id 645c9cd4a1dd0b79ccd8c0bbbd6ad0325c5870e48176f2d49707cdd64e3a78e3 Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.748835 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8c7d5b696-rkhkz" event={"ID":"b93e689a-691a-403b-970f-63547469bbfe","Type":"ContainerStarted","Data":"e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1"} Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.749186 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8c7d5b696-rkhkz" event={"ID":"b93e689a-691a-403b-970f-63547469bbfe","Type":"ContainerStarted","Data":"c125154cabcdfe5999b14b69e6aec32e2bc02f073f7c1b37d5c7e96e975bcddc"} Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.749210 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.751410 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerStarted","Data":"8606c9bfff9cd2a4cba16dfe78c4da2b07535c0b9700f29019aa987ce5e57be7"} Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.753363 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d445dfc98-wk5w4" event={"ID":"3e37998e-491a-43b8-abda-4bdfea233217","Type":"ContainerStarted","Data":"ba7a1f03f18ae86234997ab8ec3532045109f6ac2550d2fdf25633eb2d62be0a"} Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.753406 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-5d445dfc98-wk5w4" event={"ID":"3e37998e-491a-43b8-abda-4bdfea233217","Type":"ContainerStarted","Data":"9e5587596a7f6545f5ee41c7fe004abf66a409e2bfb223d64c2066b916dae202"} Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.753420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d445dfc98-wk5w4" event={"ID":"3e37998e-491a-43b8-abda-4bdfea233217","Type":"ContainerStarted","Data":"645c9cd4a1dd0b79ccd8c0bbbd6ad0325c5870e48176f2d49707cdd64e3a78e3"} Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.753557 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.753580 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.767615 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8c7d5b696-rkhkz" podStartSLOduration=1.767600293 podStartE2EDuration="1.767600293s" podCreationTimestamp="2025-10-10 07:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:04.763565213 +0000 UTC m=+1131.833156454" watchObservedRunningTime="2025-10-10 07:10:04.767600293 +0000 UTC m=+1131.837191524" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.792234 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5d445dfc98-wk5w4" podStartSLOduration=1.792214944 podStartE2EDuration="1.792214944s" podCreationTimestamp="2025-10-10 07:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:04.787033252 +0000 UTC m=+1131.856624503" watchObservedRunningTime="2025-10-10 07:10:04.792214944 +0000 
UTC m=+1131.861806185" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.935358 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-59899c8879-prgpj"] Oct 10 07:10:04 crc kubenswrapper[4732]: E1010 07:10:04.935782 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122832c9-8a6a-48f4-988c-7c4de7dd085a" containerName="barbican-db-sync" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.935804 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="122832c9-8a6a-48f4-988c-7c4de7dd085a" containerName="barbican-db-sync" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.936046 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="122832c9-8a6a-48f4-988c-7c4de7dd085a" containerName="barbican-db-sync" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.937163 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.943867 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.944118 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.944955 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-88r8l" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.954747 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-59899c8879-prgpj"] Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.972768 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-69886dc6f8-sfv6d"] Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.974160 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:04 crc kubenswrapper[4732]: I1010 07:10:04.980126 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.024190 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69886dc6f8-sfv6d"] Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-combined-ca-bundle\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034372 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034395 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data-custom\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034416 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data-custom\") 
pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034432 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nlbd\" (UniqueName: \"kubernetes.io/projected/f5d96c35-c01e-4f12-ab12-7b6342789b2f-kube-api-access-7nlbd\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034471 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlcgg\" (UniqueName: \"kubernetes.io/projected/d0fa844d-f411-49a9-a52f-256760a71157-kube-api-access-hlcgg\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034500 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034517 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0fa844d-f411-49a9-a52f-256760a71157-logs\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034549 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-combined-ca-bundle\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.034576 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5d96c35-c01e-4f12-ab12-7b6342789b2f-logs\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.042778 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f666b5f5c-bjff2"] Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.044126 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.108554 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f666b5f5c-bjff2"] Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136712 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136780 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data-custom\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " 
pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136807 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data-custom\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136833 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nlbd\" (UniqueName: \"kubernetes.io/projected/f5d96c35-c01e-4f12-ab12-7b6342789b2f-kube-api-access-7nlbd\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136902 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-svc\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136923 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlcgg\" (UniqueName: \"kubernetes.io/projected/d0fa844d-f411-49a9-a52f-256760a71157-kube-api-access-hlcgg\") pod 
\"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.136993 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137013 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0fa844d-f411-49a9-a52f-256760a71157-logs\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137029 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b88b\" (UniqueName: \"kubernetes.io/projected/3dc026a1-d604-469a-b65d-8649b7e7984e-kube-api-access-6b88b\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137048 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137084 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-combined-ca-bundle\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137118 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137154 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5d96c35-c01e-4f12-ab12-7b6342789b2f-logs\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137176 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-config\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.137210 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-combined-ca-bundle\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.138146 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d0fa844d-f411-49a9-a52f-256760a71157-logs\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.138402 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5d96c35-c01e-4f12-ab12-7b6342789b2f-logs\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.144760 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data-custom\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.145683 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-combined-ca-bundle\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.146270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data-custom\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.146593 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-combined-ca-bundle\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.147413 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.147534 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.166997 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nlbd\" (UniqueName: \"kubernetes.io/projected/f5d96c35-c01e-4f12-ab12-7b6342789b2f-kube-api-access-7nlbd\") pod \"barbican-keystone-listener-69886dc6f8-sfv6d\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.170439 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlcgg\" (UniqueName: \"kubernetes.io/projected/d0fa844d-f411-49a9-a52f-256760a71157-kube-api-access-hlcgg\") pod \"barbican-worker-59899c8879-prgpj\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.213459 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-api-5dff44767b-nqf2r"] Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.215245 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.219751 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.232315 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dff44767b-nqf2r"] Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.239020 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9deab2aa-5aca-4273-97de-da95bd0da4ab-logs\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.239200 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-svc\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.240218 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-svc\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.240265 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data-custom\") pod 
\"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.240312 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.246950 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.247070 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg7pv\" (UniqueName: \"kubernetes.io/projected/9deab2aa-5aca-4273-97de-da95bd0da4ab-kube-api-access-pg7pv\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.247188 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.247342 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b88b\" (UniqueName: \"kubernetes.io/projected/3dc026a1-d604-469a-b65d-8649b7e7984e-kube-api-access-6b88b\") pod 
\"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.247528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.247633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.247731 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-config\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.247760 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-combined-ca-bundle\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.261092 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: 
\"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.262080 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.270321 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-config\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.271201 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.271675 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b88b\" (UniqueName: \"kubernetes.io/projected/3dc026a1-d604-469a-b65d-8649b7e7984e-kube-api-access-6b88b\") pod \"dnsmasq-dns-5f666b5f5c-bjff2\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.305138 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.349593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-combined-ca-bundle\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.349652 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9deab2aa-5aca-4273-97de-da95bd0da4ab-logs\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.349747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data-custom\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.349785 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg7pv\" (UniqueName: \"kubernetes.io/projected/9deab2aa-5aca-4273-97de-da95bd0da4ab-kube-api-access-pg7pv\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.349821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " 
pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.350285 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9deab2aa-5aca-4273-97de-da95bd0da4ab-logs\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.360092 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-combined-ca-bundle\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.360670 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data-custom\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.363783 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.375452 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg7pv\" (UniqueName: \"kubernetes.io/projected/9deab2aa-5aca-4273-97de-da95bd0da4ab-kube-api-access-pg7pv\") pod \"barbican-api-5dff44767b-nqf2r\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: 
I1010 07:10:05.378302 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.542070 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.771805 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.772097 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.771992 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.773136 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.896768 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-69886dc6f8-sfv6d"] Oct 10 07:10:05 crc kubenswrapper[4732]: W1010 07:10:05.908786 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5d96c35_c01e_4f12_ab12_7b6342789b2f.slice/crio-5c84d8a53ef2654754a32aa5cd3d813906d216569388acec620ead3a4b4da84e WatchSource:0}: Error finding container 5c84d8a53ef2654754a32aa5cd3d813906d216569388acec620ead3a4b4da84e: Status 404 returned error can't find the container with id 5c84d8a53ef2654754a32aa5cd3d813906d216569388acec620ead3a4b4da84e Oct 10 07:10:05 crc kubenswrapper[4732]: I1010 07:10:05.988157 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f666b5f5c-bjff2"] Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.001918 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-59899c8879-prgpj"] Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.212917 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5dff44767b-nqf2r"] Oct 10 07:10:06 crc kubenswrapper[4732]: W1010 07:10:06.241698 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9deab2aa_5aca_4273_97de_da95bd0da4ab.slice/crio-171e52f14aba76308be98bfafe3975f022f1da4efba729fb02b9155106d02994 WatchSource:0}: Error finding container 171e52f14aba76308be98bfafe3975f022f1da4efba729fb02b9155106d02994: Status 404 returned error can't find the container with id 171e52f14aba76308be98bfafe3975f022f1da4efba729fb02b9155106d02994 Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.353883 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.357103 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.791915 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59899c8879-prgpj" event={"ID":"d0fa844d-f411-49a9-a52f-256760a71157","Type":"ContainerStarted","Data":"d25edfdabbfd8eb20706eb8c215c54f7b2f5b802298396949e349e2ee7370877"} Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.800072 4732 generic.go:334] "Generic (PLEG): container finished" podID="d4e63d44-6624-462c-9bbd-c6a160083bd0" containerID="f8c61dd771da3d41b95b01e1da13533f302a1c15793a69ab61292b755af3bd72" exitCode=0 Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.800576 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9sdb9" 
event={"ID":"d4e63d44-6624-462c-9bbd-c6a160083bd0","Type":"ContainerDied","Data":"f8c61dd771da3d41b95b01e1da13533f302a1c15793a69ab61292b755af3bd72"} Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.804551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6pdc" event={"ID":"b89ce220-623c-443f-93f4-4a960ffe29eb","Type":"ContainerStarted","Data":"c932149ea67159f9a84814065ec885afcf27d6ffe2cca727317cb446e3e1836a"} Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.807889 4732 generic.go:334] "Generic (PLEG): container finished" podID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerID="6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8" exitCode=0 Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.807972 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" event={"ID":"3dc026a1-d604-469a-b65d-8649b7e7984e","Type":"ContainerDied","Data":"6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8"} Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.808000 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" event={"ID":"3dc026a1-d604-469a-b65d-8649b7e7984e","Type":"ContainerStarted","Data":"2340f296105d909a3cf56a878c60dff6deb5bd3da7e48b80f9dc770025d87a74"} Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.812218 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" event={"ID":"f5d96c35-c01e-4f12-ab12-7b6342789b2f","Type":"ContainerStarted","Data":"5c84d8a53ef2654754a32aa5cd3d813906d216569388acec620ead3a4b4da84e"} Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.821343 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff44767b-nqf2r" event={"ID":"9deab2aa-5aca-4273-97de-da95bd0da4ab","Type":"ContainerStarted","Data":"5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d"} Oct 10 
07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.821389 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff44767b-nqf2r" event={"ID":"9deab2aa-5aca-4273-97de-da95bd0da4ab","Type":"ContainerStarted","Data":"171e52f14aba76308be98bfafe3975f022f1da4efba729fb02b9155106d02994"} Oct 10 07:10:06 crc kubenswrapper[4732]: I1010 07:10:06.854842 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-q6pdc" podStartSLOduration=3.006512224 podStartE2EDuration="35.854825437s" podCreationTimestamp="2025-10-10 07:09:31 +0000 UTC" firstStartedPulling="2025-10-10 07:09:32.447517098 +0000 UTC m=+1099.517108339" lastFinishedPulling="2025-10-10 07:10:05.295830321 +0000 UTC m=+1132.365421552" observedRunningTime="2025-10-10 07:10:06.849450221 +0000 UTC m=+1133.919041462" watchObservedRunningTime="2025-10-10 07:10:06.854825437 +0000 UTC m=+1133.924416668" Oct 10 07:10:07 crc kubenswrapper[4732]: I1010 07:10:07.761276 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 07:10:07 crc kubenswrapper[4732]: I1010 07:10:07.761647 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 07:10:07 crc kubenswrapper[4732]: I1010 07:10:07.842029 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" event={"ID":"3dc026a1-d604-469a-b65d-8649b7e7984e","Type":"ContainerStarted","Data":"0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c"} Oct 10 07:10:07 crc kubenswrapper[4732]: I1010 07:10:07.842496 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:07 crc kubenswrapper[4732]: I1010 07:10:07.850520 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff44767b-nqf2r" 
event={"ID":"9deab2aa-5aca-4273-97de-da95bd0da4ab","Type":"ContainerStarted","Data":"101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42"} Oct 10 07:10:07 crc kubenswrapper[4732]: I1010 07:10:07.890121 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" podStartSLOduration=3.890102678 podStartE2EDuration="3.890102678s" podCreationTimestamp="2025-10-10 07:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:07.868481139 +0000 UTC m=+1134.938072400" watchObservedRunningTime="2025-10-10 07:10:07.890102678 +0000 UTC m=+1134.959693909" Oct 10 07:10:07 crc kubenswrapper[4732]: I1010 07:10:07.904285 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5dff44767b-nqf2r" podStartSLOduration=2.904263264 podStartE2EDuration="2.904263264s" podCreationTimestamp="2025-10-10 07:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:07.887560209 +0000 UTC m=+1134.957151470" watchObservedRunningTime="2025-10-10 07:10:07.904263264 +0000 UTC m=+1134.973854505" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.548738 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.628838 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-combined-ca-bundle\") pod \"d4e63d44-6624-462c-9bbd-c6a160083bd0\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.628914 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqwj4\" (UniqueName: \"kubernetes.io/projected/d4e63d44-6624-462c-9bbd-c6a160083bd0-kube-api-access-dqwj4\") pod \"d4e63d44-6624-462c-9bbd-c6a160083bd0\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.628995 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-config\") pod \"d4e63d44-6624-462c-9bbd-c6a160083bd0\" (UID: \"d4e63d44-6624-462c-9bbd-c6a160083bd0\") " Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.636724 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e63d44-6624-462c-9bbd-c6a160083bd0-kube-api-access-dqwj4" (OuterVolumeSpecName: "kube-api-access-dqwj4") pod "d4e63d44-6624-462c-9bbd-c6a160083bd0" (UID: "d4e63d44-6624-462c-9bbd-c6a160083bd0"). InnerVolumeSpecName "kube-api-access-dqwj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.663603 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-config" (OuterVolumeSpecName: "config") pod "d4e63d44-6624-462c-9bbd-c6a160083bd0" (UID: "d4e63d44-6624-462c-9bbd-c6a160083bd0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.665217 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4e63d44-6624-462c-9bbd-c6a160083bd0" (UID: "d4e63d44-6624-462c-9bbd-c6a160083bd0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.730823 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.730865 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e63d44-6624-462c-9bbd-c6a160083bd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.730878 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqwj4\" (UniqueName: \"kubernetes.io/projected/d4e63d44-6624-462c-9bbd-c6a160083bd0-kube-api-access-dqwj4\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.890114 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-9sdb9" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.890798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-9sdb9" event={"ID":"d4e63d44-6624-462c-9bbd-c6a160083bd0","Type":"ContainerDied","Data":"b418214eaa048dc56f7da039e34d486407f9f4a1859396591210262c617e9e4c"} Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.890845 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b418214eaa048dc56f7da039e34d486407f9f4a1859396591210262c617e9e4c" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.890869 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:08 crc kubenswrapper[4732]: I1010 07:10:08.890931 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.018712 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f666b5f5c-bjff2"] Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.032229 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7949456448-wncp2"] Oct 10 07:10:09 crc kubenswrapper[4732]: E1010 07:10:09.032618 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e63d44-6624-462c-9bbd-c6a160083bd0" containerName="neutron-db-sync" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.032631 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e63d44-6624-462c-9bbd-c6a160083bd0" containerName="neutron-db-sync" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.032885 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e63d44-6624-462c-9bbd-c6a160083bd0" containerName="neutron-db-sync" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.033863 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.037121 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.037671 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7949456448-wncp2"] Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.040039 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.101580 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8cd96d85-wz9jb"] Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.103795 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.124788 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8cd96d85-wz9jb"] Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139563 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139615 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-combined-ca-bundle\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139639 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-svc\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139704 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139721 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data-custom\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139742 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139769 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-logs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 
07:10:09.139786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zc8m\" (UniqueName: \"kubernetes.io/projected/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-kube-api-access-9zc8m\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139805 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-public-tls-certs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139825 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9rl\" (UniqueName: \"kubernetes.io/projected/00875c9d-e02f-4325-8a8a-d354fdf4cd80-kube-api-access-9p9rl\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139847 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139867 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-config\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " 
pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.139883 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-internal-tls-certs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.235775 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57f66cf84d-qjn76"] Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.237642 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.240399 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.240597 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.240727 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241115 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data-custom\") pod \"barbican-api-7949456448-wncp2\" (UID: 
\"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241183 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241219 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-logs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241245 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zc8m\" (UniqueName: \"kubernetes.io/projected/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-kube-api-access-9zc8m\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241265 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-public-tls-certs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241287 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9rl\" (UniqueName: \"kubernetes.io/projected/00875c9d-e02f-4325-8a8a-d354fdf4cd80-kube-api-access-9p9rl\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: 
\"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241310 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-config\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241346 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-internal-tls-certs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241394 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241425 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-combined-ca-bundle\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " 
pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241443 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-svc\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.241721 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-logs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.244042 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.244359 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.245158 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.246094 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-svc\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.246156 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-config\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.246270 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pcm7s" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.247440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data-custom\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.249194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-public-tls-certs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.249373 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 
07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.249913 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-internal-tls-certs\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.252610 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57f66cf84d-qjn76"] Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.253220 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-combined-ca-bundle\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.267867 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zc8m\" (UniqueName: \"kubernetes.io/projected/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-kube-api-access-9zc8m\") pod \"barbican-api-7949456448-wncp2\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.268040 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9rl\" (UniqueName: \"kubernetes.io/projected/00875c9d-e02f-4325-8a8a-d354fdf4cd80-kube-api-access-9p9rl\") pod \"dnsmasq-dns-7b8cd96d85-wz9jb\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.343059 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-combined-ca-bundle\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.343110 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shsxk\" (UniqueName: \"kubernetes.io/projected/104e2934-13aa-441b-b330-be153b392e7f-kube-api-access-shsxk\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.343258 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-httpd-config\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.343409 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-config\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.343447 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-ovndb-tls-certs\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.357724 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.443916 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.446139 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-config\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.446229 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-ovndb-tls-certs\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.446310 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-combined-ca-bundle\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.446345 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shsxk\" (UniqueName: \"kubernetes.io/projected/104e2934-13aa-441b-b330-be153b392e7f-kube-api-access-shsxk\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.446430 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-httpd-config\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.450857 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-httpd-config\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.456104 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-ovndb-tls-certs\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.460536 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-combined-ca-bundle\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.461036 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-config\") pod \"neutron-57f66cf84d-qjn76\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.467167 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shsxk\" (UniqueName: \"kubernetes.io/projected/104e2934-13aa-441b-b330-be153b392e7f-kube-api-access-shsxk\") pod \"neutron-57f66cf84d-qjn76\" (UID: 
\"104e2934-13aa-441b-b330-be153b392e7f\") " pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.601141 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.913721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59899c8879-prgpj" event={"ID":"d0fa844d-f411-49a9-a52f-256760a71157","Type":"ContainerStarted","Data":"a99618b8dab7c28ba86268863ff2d9ff67fee28ff3451930b3093109a33ec4fa"} Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.920722 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" podUID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerName="dnsmasq-dns" containerID="cri-o://0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c" gracePeriod=10 Oct 10 07:10:09 crc kubenswrapper[4732]: I1010 07:10:09.922871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" event={"ID":"f5d96c35-c01e-4f12-ab12-7b6342789b2f","Type":"ContainerStarted","Data":"0865053222fa0b5007b970a0a688d80dffb2106bd8ecb37be061b0d8aaf978cd"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.006556 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7949456448-wncp2"] Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.101956 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8cd96d85-wz9jb"] Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.371814 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57f66cf84d-qjn76"] Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.389439 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:10 crc kubenswrapper[4732]: W1010 07:10:10.390237 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod104e2934_13aa_441b_b330_be153b392e7f.slice/crio-46e8c2a21b467b2daedc279a5da3e9a3e039267f3c1b603e92eea776b9e41d3f WatchSource:0}: Error finding container 46e8c2a21b467b2daedc279a5da3e9a3e039267f3c1b603e92eea776b9e41d3f: Status 404 returned error can't find the container with id 46e8c2a21b467b2daedc279a5da3e9a3e039267f3c1b603e92eea776b9e41d3f Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.468006 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-nb\") pod \"3dc026a1-d604-469a-b65d-8649b7e7984e\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.468281 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-swift-storage-0\") pod \"3dc026a1-d604-469a-b65d-8649b7e7984e\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.468306 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b88b\" (UniqueName: \"kubernetes.io/projected/3dc026a1-d604-469a-b65d-8649b7e7984e-kube-api-access-6b88b\") pod \"3dc026a1-d604-469a-b65d-8649b7e7984e\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.468358 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-svc\") pod \"3dc026a1-d604-469a-b65d-8649b7e7984e\" (UID: 
\"3dc026a1-d604-469a-b65d-8649b7e7984e\") " Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.468479 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-config\") pod \"3dc026a1-d604-469a-b65d-8649b7e7984e\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.468494 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-sb\") pod \"3dc026a1-d604-469a-b65d-8649b7e7984e\" (UID: \"3dc026a1-d604-469a-b65d-8649b7e7984e\") " Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.476180 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc026a1-d604-469a-b65d-8649b7e7984e-kube-api-access-6b88b" (OuterVolumeSpecName: "kube-api-access-6b88b") pod "3dc026a1-d604-469a-b65d-8649b7e7984e" (UID: "3dc026a1-d604-469a-b65d-8649b7e7984e"). InnerVolumeSpecName "kube-api-access-6b88b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.529028 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3dc026a1-d604-469a-b65d-8649b7e7984e" (UID: "3dc026a1-d604-469a-b65d-8649b7e7984e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.535763 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dc026a1-d604-469a-b65d-8649b7e7984e" (UID: "3dc026a1-d604-469a-b65d-8649b7e7984e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.543246 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dc026a1-d604-469a-b65d-8649b7e7984e" (UID: "3dc026a1-d604-469a-b65d-8649b7e7984e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.558562 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-config" (OuterVolumeSpecName: "config") pod "3dc026a1-d604-469a-b65d-8649b7e7984e" (UID: "3dc026a1-d604-469a-b65d-8649b7e7984e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.562364 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dc026a1-d604-469a-b65d-8649b7e7984e" (UID: "3dc026a1-d604-469a-b65d-8649b7e7984e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.570614 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.570644 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.570654 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.570663 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.570672 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b88b\" (UniqueName: \"kubernetes.io/projected/3dc026a1-d604-469a-b65d-8649b7e7984e-kube-api-access-6b88b\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.570681 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dc026a1-d604-469a-b65d-8649b7e7984e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.936019 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59899c8879-prgpj" event={"ID":"d0fa844d-f411-49a9-a52f-256760a71157","Type":"ContainerStarted","Data":"ef9057491152a5767b996f9aa867ebd6dd43e2419bbb3e1338a179c119d1dd11"} Oct 10 07:10:10 crc 
kubenswrapper[4732]: I1010 07:10:10.962907 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7949456448-wncp2" event={"ID":"2960d902-25b0-4fb8-baa7-fe7f9d4f5811","Type":"ContainerStarted","Data":"d77b0880ffd05c296384bd3f19b5b8b3ab8e7f54824859cab96f74dddf1fd9e2"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.962959 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7949456448-wncp2" event={"ID":"2960d902-25b0-4fb8-baa7-fe7f9d4f5811","Type":"ContainerStarted","Data":"37340908e0e97c3a729ef1965ebc6960980d4f8695b1143d55d26a38eea03ce7"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.962974 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7949456448-wncp2" event={"ID":"2960d902-25b0-4fb8-baa7-fe7f9d4f5811","Type":"ContainerStarted","Data":"82d576aaa00b31274d28aa29364ab9bbd5df67563209ffa142c0c03adbc3f704"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.962989 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.963009 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.969515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f66cf84d-qjn76" event={"ID":"104e2934-13aa-441b-b330-be153b392e7f","Type":"ContainerStarted","Data":"25c855b0584ceb27b3166ed7bf66900e6657bef7a1203215356f7e860120abe3"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.969556 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f66cf84d-qjn76" event={"ID":"104e2934-13aa-441b-b330-be153b392e7f","Type":"ContainerStarted","Data":"46e8c2a21b467b2daedc279a5da3e9a3e039267f3c1b603e92eea776b9e41d3f"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.972019 4732 generic.go:334] "Generic 
(PLEG): container finished" podID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerID="0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c" exitCode=0 Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.972078 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" event={"ID":"3dc026a1-d604-469a-b65d-8649b7e7984e","Type":"ContainerDied","Data":"0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.972100 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" event={"ID":"3dc026a1-d604-469a-b65d-8649b7e7984e","Type":"ContainerDied","Data":"2340f296105d909a3cf56a878c60dff6deb5bd3da7e48b80f9dc770025d87a74"} Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.972120 4732 scope.go:117] "RemoveContainer" containerID="0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.972264 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f666b5f5c-bjff2" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.991054 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-59899c8879-prgpj" podStartSLOduration=3.518297783 podStartE2EDuration="6.991037286s" podCreationTimestamp="2025-10-10 07:10:04 +0000 UTC" firstStartedPulling="2025-10-10 07:10:06.0210637 +0000 UTC m=+1133.090654941" lastFinishedPulling="2025-10-10 07:10:09.493803203 +0000 UTC m=+1136.563394444" observedRunningTime="2025-10-10 07:10:10.958105248 +0000 UTC m=+1138.027696519" watchObservedRunningTime="2025-10-10 07:10:10.991037286 +0000 UTC m=+1138.060628527" Oct 10 07:10:10 crc kubenswrapper[4732]: I1010 07:10:10.997235 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7949456448-wncp2" podStartSLOduration=2.997220154 podStartE2EDuration="2.997220154s" podCreationTimestamp="2025-10-10 07:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:10.985910946 +0000 UTC m=+1138.055502207" watchObservedRunningTime="2025-10-10 07:10:10.997220154 +0000 UTC m=+1138.066811395" Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.000439 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" event={"ID":"f5d96c35-c01e-4f12-ab12-7b6342789b2f","Type":"ContainerStarted","Data":"3854a4792dd684bc5e205f322924b756bc947446fb50e6b4d2c49c8df807513b"} Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.001979 4732 generic.go:334] "Generic (PLEG): container finished" podID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerID="c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a" exitCode=0 Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.002455 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" event={"ID":"00875c9d-e02f-4325-8a8a-d354fdf4cd80","Type":"ContainerDied","Data":"c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a"} Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.002500 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" event={"ID":"00875c9d-e02f-4325-8a8a-d354fdf4cd80","Type":"ContainerStarted","Data":"bad124d34071258086cd11f21cd9f6fa3fa95943c9e57caac0be4631adcf1a0f"} Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.012020 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f666b5f5c-bjff2"] Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.025020 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f666b5f5c-bjff2"] Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.031948 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" podStartSLOduration=3.4459786709999998 podStartE2EDuration="7.03192201s" podCreationTimestamp="2025-10-10 07:10:04 +0000 UTC" firstStartedPulling="2025-10-10 07:10:05.926863892 +0000 UTC m=+1132.996455143" lastFinishedPulling="2025-10-10 07:10:09.512807241 +0000 UTC m=+1136.582398482" observedRunningTime="2025-10-10 07:10:11.02529843 +0000 UTC m=+1138.094889681" watchObservedRunningTime="2025-10-10 07:10:11.03192201 +0000 UTC m=+1138.101513251" Oct 10 07:10:11 crc kubenswrapper[4732]: I1010 07:10:11.706846 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc026a1-d604-469a-b65d-8649b7e7984e" path="/var/lib/kubelet/pods/3dc026a1-d604-469a-b65d-8649b7e7984e/volumes" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.040216 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-785547cb47-x77nc"] Oct 10 07:10:12 crc kubenswrapper[4732]: E1010 07:10:12.040666 4732 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerName="init" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.040680 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerName="init" Oct 10 07:10:12 crc kubenswrapper[4732]: E1010 07:10:12.040743 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerName="dnsmasq-dns" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.040753 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerName="dnsmasq-dns" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.040961 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc026a1-d604-469a-b65d-8649b7e7984e" containerName="dnsmasq-dns" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.043298 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.053363 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.056903 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.067871 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-785547cb47-x77nc"] Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.110195 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-ovndb-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.110246 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-combined-ca-bundle\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.110269 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-public-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.110328 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqnpx\" (UniqueName: \"kubernetes.io/projected/eb94a64c-1a0c-4a61-bb69-e843b627cf35-kube-api-access-lqnpx\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.110351 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-internal-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.110384 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-httpd-config\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: 
I1010 07:10:12.110402 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-config\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.212278 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-ovndb-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.212335 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-combined-ca-bundle\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.212354 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-public-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.212392 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqnpx\" (UniqueName: \"kubernetes.io/projected/eb94a64c-1a0c-4a61-bb69-e843b627cf35-kube-api-access-lqnpx\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.212414 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-internal-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.212434 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-httpd-config\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.212451 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-config\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.229249 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-internal-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.232418 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-httpd-config\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.233756 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqnpx\" (UniqueName: 
\"kubernetes.io/projected/eb94a64c-1a0c-4a61-bb69-e843b627cf35-kube-api-access-lqnpx\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.234058 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-config\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.234422 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-ovndb-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.235156 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-public-tls-certs\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.241355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-combined-ca-bundle\") pod \"neutron-785547cb47-x77nc\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.368606 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.872162 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:12 crc kubenswrapper[4732]: I1010 07:10:12.990905 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:13 crc kubenswrapper[4732]: I1010 07:10:13.045456 4732 generic.go:334] "Generic (PLEG): container finished" podID="b89ce220-623c-443f-93f4-4a960ffe29eb" containerID="c932149ea67159f9a84814065ec885afcf27d6ffe2cca727317cb446e3e1836a" exitCode=0 Oct 10 07:10:13 crc kubenswrapper[4732]: I1010 07:10:13.045550 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6pdc" event={"ID":"b89ce220-623c-443f-93f4-4a960ffe29eb","Type":"ContainerDied","Data":"c932149ea67159f9a84814065ec885afcf27d6ffe2cca727317cb446e3e1836a"} Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.612415 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.686764 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-combined-ca-bundle\") pod \"b89ce220-623c-443f-93f4-4a960ffe29eb\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.686846 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-db-sync-config-data\") pod \"b89ce220-623c-443f-93f4-4a960ffe29eb\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.686882 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqq45\" (UniqueName: \"kubernetes.io/projected/b89ce220-623c-443f-93f4-4a960ffe29eb-kube-api-access-lqq45\") pod \"b89ce220-623c-443f-93f4-4a960ffe29eb\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.686964 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-scripts\") pod \"b89ce220-623c-443f-93f4-4a960ffe29eb\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.687014 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-config-data\") pod \"b89ce220-623c-443f-93f4-4a960ffe29eb\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.687033 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/b89ce220-623c-443f-93f4-4a960ffe29eb-etc-machine-id\") pod \"b89ce220-623c-443f-93f4-4a960ffe29eb\" (UID: \"b89ce220-623c-443f-93f4-4a960ffe29eb\") " Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.687584 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b89ce220-623c-443f-93f4-4a960ffe29eb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b89ce220-623c-443f-93f4-4a960ffe29eb" (UID: "b89ce220-623c-443f-93f4-4a960ffe29eb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.696394 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b89ce220-623c-443f-93f4-4a960ffe29eb" (UID: "b89ce220-623c-443f-93f4-4a960ffe29eb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.702835 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89ce220-623c-443f-93f4-4a960ffe29eb-kube-api-access-lqq45" (OuterVolumeSpecName: "kube-api-access-lqq45") pod "b89ce220-623c-443f-93f4-4a960ffe29eb" (UID: "b89ce220-623c-443f-93f4-4a960ffe29eb"). InnerVolumeSpecName "kube-api-access-lqq45". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.702832 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-scripts" (OuterVolumeSpecName: "scripts") pod "b89ce220-623c-443f-93f4-4a960ffe29eb" (UID: "b89ce220-623c-443f-93f4-4a960ffe29eb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.715440 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b89ce220-623c-443f-93f4-4a960ffe29eb" (UID: "b89ce220-623c-443f-93f4-4a960ffe29eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.743499 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-config-data" (OuterVolumeSpecName: "config-data") pod "b89ce220-623c-443f-93f4-4a960ffe29eb" (UID: "b89ce220-623c-443f-93f4-4a960ffe29eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.789822 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.789858 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.789872 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqq45\" (UniqueName: \"kubernetes.io/projected/b89ce220-623c-443f-93f4-4a960ffe29eb-kube-api-access-lqq45\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.789884 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-scripts\") on node \"crc\" 
DevicePath \"\"" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.789897 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b89ce220-623c-443f-93f4-4a960ffe29eb-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:15 crc kubenswrapper[4732]: I1010 07:10:15.789908 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b89ce220-623c-443f-93f4-4a960ffe29eb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.080952 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-q6pdc" event={"ID":"b89ce220-623c-443f-93f4-4a960ffe29eb","Type":"ContainerDied","Data":"e2a8782261250cfe89c6ac0e912d56695f00120aa1f12ae40b2b0593166df54c"} Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.081264 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2a8782261250cfe89c6ac0e912d56695f00120aa1f12ae40b2b0593166df54c" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.081011 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-q6pdc" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.910044 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:16 crc kubenswrapper[4732]: E1010 07:10:16.910525 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89ce220-623c-443f-93f4-4a960ffe29eb" containerName="cinder-db-sync" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.910543 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89ce220-623c-443f-93f4-4a960ffe29eb" containerName="cinder-db-sync" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.913197 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89ce220-623c-443f-93f4-4a960ffe29eb" containerName="cinder-db-sync" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.914473 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.919842 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fdw6m" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.920064 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.920200 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.937497 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:16 crc kubenswrapper[4732]: I1010 07:10:16.942361 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.011876 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.011933 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.011970 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65lhf\" (UniqueName: \"kubernetes.io/projected/5214162b-4dac-4e5d-bddc-19d4a50dc78f-kube-api-access-65lhf\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.012033 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.012076 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5214162b-4dac-4e5d-bddc-19d4a50dc78f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.012199 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.017046 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8cd96d85-wz9jb"] Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.089788 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd7dbbffc-c2s4k"] Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.091959 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.113677 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.113956 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.113975 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114000 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-65lhf\" (UniqueName: \"kubernetes.io/projected/5214162b-4dac-4e5d-bddc-19d4a50dc78f-kube-api-access-65lhf\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114028 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54wpv\" (UniqueName: \"kubernetes.io/projected/39f93b6c-c094-4275-ade1-5b4b2d95143c-kube-api-access-54wpv\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114060 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114080 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5214162b-4dac-4e5d-bddc-19d4a50dc78f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114107 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114158 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114194 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-svc\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114232 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-config\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114300 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.114870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5214162b-4dac-4e5d-bddc-19d4a50dc78f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.127275 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.129045 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-scripts\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.136507 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd7dbbffc-c2s4k"] Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.141897 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.142711 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.173369 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65lhf\" (UniqueName: \"kubernetes.io/projected/5214162b-4dac-4e5d-bddc-19d4a50dc78f-kube-api-access-65lhf\") pod \"cinder-scheduler-0\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.218832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54wpv\" (UniqueName: 
\"kubernetes.io/projected/39f93b6c-c094-4275-ade1-5b4b2d95143c-kube-api-access-54wpv\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.218928 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.218979 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-svc\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.219024 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-config\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.219078 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.219120 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.220298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.221271 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.222100 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-svc\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.222776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-config\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.223415 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.246314 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.251601 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54wpv\" (UniqueName: \"kubernetes.io/projected/39f93b6c-c094-4275-ade1-5b4b2d95143c-kube-api-access-54wpv\") pod \"dnsmasq-dns-5fd7dbbffc-c2s4k\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.316598 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.361924 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.363816 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.367652 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.372876 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.423752 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.424111 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.424316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxcbj\" (UniqueName: \"kubernetes.io/projected/e980e794-2380-4c1e-ae4d-9a27fb5338f0-kube-api-access-lxcbj\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.424429 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e980e794-2380-4c1e-ae4d-9a27fb5338f0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.424528 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e980e794-2380-4c1e-ae4d-9a27fb5338f0-logs\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.424716 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.424839 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-scripts\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.482707 4732 scope.go:117] "RemoveContainer" containerID="6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.526280 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.526347 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.526412 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lxcbj\" (UniqueName: \"kubernetes.io/projected/e980e794-2380-4c1e-ae4d-9a27fb5338f0-kube-api-access-lxcbj\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.526439 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e980e794-2380-4c1e-ae4d-9a27fb5338f0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.526462 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e980e794-2380-4c1e-ae4d-9a27fb5338f0-logs\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.526513 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.526551 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-scripts\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.527236 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e980e794-2380-4c1e-ae4d-9a27fb5338f0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " 
pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.527478 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e980e794-2380-4c1e-ae4d-9a27fb5338f0-logs\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.531349 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.537438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data-custom\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.537922 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.538088 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-scripts\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.550584 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxcbj\" (UniqueName: 
\"kubernetes.io/projected/e980e794-2380-4c1e-ae4d-9a27fb5338f0-kube-api-access-lxcbj\") pod \"cinder-api-0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.685431 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.713914 4732 scope.go:117] "RemoveContainer" containerID="0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c" Oct 10 07:10:17 crc kubenswrapper[4732]: E1010 07:10:17.714511 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c\": container with ID starting with 0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c not found: ID does not exist" containerID="0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.714557 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c"} err="failed to get container status \"0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c\": rpc error: code = NotFound desc = could not find container \"0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c\": container with ID starting with 0400519e6e1d9501a2e10e865d9f8837b82575ab61675288d7320639e55ad35c not found: ID does not exist" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.714584 4732 scope.go:117] "RemoveContainer" containerID="6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8" Oct 10 07:10:17 crc kubenswrapper[4732]: E1010 07:10:17.714930 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8\": container with ID starting with 6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8 not found: ID does not exist" containerID="6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8" Oct 10 07:10:17 crc kubenswrapper[4732]: I1010 07:10:17.714951 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8"} err="failed to get container status \"6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8\": rpc error: code = NotFound desc = could not find container \"6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8\": container with ID starting with 6974277e0abb7b67f6fa9bc664c8fc85095aa4fc156cfde00284d8e9040a0ca8 not found: ID does not exist" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.074947 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-785547cb47-x77nc"] Oct 10 07:10:18 crc kubenswrapper[4732]: W1010 07:10:18.091254 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb94a64c_1a0c_4a61_bb69_e843b627cf35.slice/crio-2126a92cc04a08ec64588b8ed283890c1fdd15a9133abe73dc8b540050740668 WatchSource:0}: Error finding container 2126a92cc04a08ec64588b8ed283890c1fdd15a9133abe73dc8b540050740668: Status 404 returned error can't find the container with id 2126a92cc04a08ec64588b8ed283890c1fdd15a9133abe73dc8b540050740668 Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.134172 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" event={"ID":"00875c9d-e02f-4325-8a8a-d354fdf4cd80","Type":"ContainerStarted","Data":"2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b"} Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.134559 4732 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" podUID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerName="dnsmasq-dns" containerID="cri-o://2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b" gracePeriod=10 Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.134815 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.145704 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerStarted","Data":"e774db61a404214da0ddab611495758c937ad3fd9a2202565bf879ea78a21891"} Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.149595 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="ceilometer-central-agent" containerID="cri-o://2179c865f9814250bd4f2d617e85dbe0f1e81cb9fda3265c86a32d30a2a34b79" gracePeriod=30 Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.149634 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.149816 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="sg-core" containerID="cri-o://8606c9bfff9cd2a4cba16dfe78c4da2b07535c0b9700f29019aa987ce5e57be7" gracePeriod=30 Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.149909 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="proxy-httpd" containerID="cri-o://e774db61a404214da0ddab611495758c937ad3fd9a2202565bf879ea78a21891" gracePeriod=30 Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.150005 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="ceilometer-notification-agent" containerID="cri-o://78ed76c2c1f4af26198c1385456af18bf92248dd6b0384fea2893b66333c5509" gracePeriod=30 Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.154093 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f66cf84d-qjn76" event={"ID":"104e2934-13aa-441b-b330-be153b392e7f","Type":"ContainerStarted","Data":"c2bc0bad73a63836f3b5f7e1858e7d043d3e8d44330d1d892ae6678ea35371be"} Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.154255 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.157135 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.179402 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785547cb47-x77nc" event={"ID":"eb94a64c-1a0c-4a61-bb69-e843b627cf35","Type":"ContainerStarted","Data":"2126a92cc04a08ec64588b8ed283890c1fdd15a9133abe73dc8b540050740668"} Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.185847 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" podStartSLOduration=9.185824127 podStartE2EDuration="9.185824127s" podCreationTimestamp="2025-10-10 07:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:18.164821604 +0000 UTC m=+1145.234412845" watchObservedRunningTime="2025-10-10 07:10:18.185824127 +0000 UTC m=+1145.255415368" Oct 10 07:10:18 crc kubenswrapper[4732]: W1010 07:10:18.198621 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5214162b_4dac_4e5d_bddc_19d4a50dc78f.slice/crio-c7d182ac1b12f5700c3e08cf27671437cf2f305fd440444aec1460593a3b0d3f WatchSource:0}: Error finding container c7d182ac1b12f5700c3e08cf27671437cf2f305fd440444aec1460593a3b0d3f: Status 404 returned error can't find the container with id c7d182ac1b12f5700c3e08cf27671437cf2f305fd440444aec1460593a3b0d3f Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.203501 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57f66cf84d-qjn76" podStartSLOduration=9.203481308 podStartE2EDuration="9.203481308s" podCreationTimestamp="2025-10-10 07:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:18.202259045 +0000 UTC m=+1145.271850286" watchObservedRunningTime="2025-10-10 07:10:18.203481308 +0000 UTC m=+1145.273072549" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.287834 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.275758731 podStartE2EDuration="49.287815747s" podCreationTimestamp="2025-10-10 07:09:29 +0000 UTC" firstStartedPulling="2025-10-10 07:09:30.674778559 +0000 UTC m=+1097.744369800" lastFinishedPulling="2025-10-10 07:10:17.686835575 +0000 UTC m=+1144.756426816" observedRunningTime="2025-10-10 07:10:18.238152533 +0000 UTC m=+1145.307743794" watchObservedRunningTime="2025-10-10 07:10:18.287815747 +0000 UTC m=+1145.357406988" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.353972 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd7dbbffc-c2s4k"] Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.479329 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.743943 4732 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.856843 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-config\") pod \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.856969 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-nb\") pod \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.857090 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-svc\") pod \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.857118 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-swift-storage-0\") pod \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.857214 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p9rl\" (UniqueName: \"kubernetes.io/projected/00875c9d-e02f-4325-8a8a-d354fdf4cd80-kube-api-access-9p9rl\") pod \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.857296 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-sb\") pod \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\" (UID: \"00875c9d-e02f-4325-8a8a-d354fdf4cd80\") " Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.873277 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00875c9d-e02f-4325-8a8a-d354fdf4cd80-kube-api-access-9p9rl" (OuterVolumeSpecName: "kube-api-access-9p9rl") pod "00875c9d-e02f-4325-8a8a-d354fdf4cd80" (UID: "00875c9d-e02f-4325-8a8a-d354fdf4cd80"). InnerVolumeSpecName "kube-api-access-9p9rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.924453 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-config" (OuterVolumeSpecName: "config") pod "00875c9d-e02f-4325-8a8a-d354fdf4cd80" (UID: "00875c9d-e02f-4325-8a8a-d354fdf4cd80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.932991 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00875c9d-e02f-4325-8a8a-d354fdf4cd80" (UID: "00875c9d-e02f-4325-8a8a-d354fdf4cd80"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.938485 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00875c9d-e02f-4325-8a8a-d354fdf4cd80" (UID: "00875c9d-e02f-4325-8a8a-d354fdf4cd80"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.959553 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.959588 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.959599 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p9rl\" (UniqueName: \"kubernetes.io/projected/00875c9d-e02f-4325-8a8a-d354fdf4cd80-kube-api-access-9p9rl\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.959609 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:18 crc kubenswrapper[4732]: I1010 07:10:18.982334 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00875c9d-e02f-4325-8a8a-d354fdf4cd80" (UID: "00875c9d-e02f-4325-8a8a-d354fdf4cd80"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.010504 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00875c9d-e02f-4325-8a8a-d354fdf4cd80" (UID: "00875c9d-e02f-4325-8a8a-d354fdf4cd80"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.060761 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.061287 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00875c9d-e02f-4325-8a8a-d354fdf4cd80-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.202231 4732 generic.go:334] "Generic (PLEG): container finished" podID="7abb736c-8131-4268-9d1c-3ecf24023962" containerID="8606c9bfff9cd2a4cba16dfe78c4da2b07535c0b9700f29019aa987ce5e57be7" exitCode=2 Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.202508 4732 generic.go:334] "Generic (PLEG): container finished" podID="7abb736c-8131-4268-9d1c-3ecf24023962" containerID="2179c865f9814250bd4f2d617e85dbe0f1e81cb9fda3265c86a32d30a2a34b79" exitCode=0 Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.202298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerDied","Data":"8606c9bfff9cd2a4cba16dfe78c4da2b07535c0b9700f29019aa987ce5e57be7"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.202575 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerDied","Data":"2179c865f9814250bd4f2d617e85dbe0f1e81cb9fda3265c86a32d30a2a34b79"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.205124 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e980e794-2380-4c1e-ae4d-9a27fb5338f0","Type":"ContainerStarted","Data":"06c2dd81779f330feb3e75519f14c0a524bf08695080fc82ebd237c11703dce8"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.206911 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" event={"ID":"39f93b6c-c094-4275-ade1-5b4b2d95143c","Type":"ContainerDied","Data":"9f497b665fa9e42cf7693108cecd316d9307d9fd4c3d03a5df7b8a4863f3c4a6"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.206883 4732 generic.go:334] "Generic (PLEG): container finished" podID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerID="9f497b665fa9e42cf7693108cecd316d9307d9fd4c3d03a5df7b8a4863f3c4a6" exitCode=0 Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.207311 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" event={"ID":"39f93b6c-c094-4275-ade1-5b4b2d95143c","Type":"ContainerStarted","Data":"2cf91532bf467b39018801e8c159865dd55d88b20075e91c3e43538c5f572120"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.258390 4732 generic.go:334] "Generic (PLEG): container finished" podID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerID="2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b" exitCode=0 Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.258493 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" event={"ID":"00875c9d-e02f-4325-8a8a-d354fdf4cd80","Type":"ContainerDied","Data":"2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.258523 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" event={"ID":"00875c9d-e02f-4325-8a8a-d354fdf4cd80","Type":"ContainerDied","Data":"bad124d34071258086cd11f21cd9f6fa3fa95943c9e57caac0be4631adcf1a0f"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.258542 4732 scope.go:117] 
"RemoveContainer" containerID="2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.258715 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8cd96d85-wz9jb" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.266866 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5214162b-4dac-4e5d-bddc-19d4a50dc78f","Type":"ContainerStarted","Data":"c7d182ac1b12f5700c3e08cf27671437cf2f305fd440444aec1460593a3b0d3f"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.276118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785547cb47-x77nc" event={"ID":"eb94a64c-1a0c-4a61-bb69-e843b627cf35","Type":"ContainerStarted","Data":"7c102b472c047348ed7dff4aff9894c0cde366c8c678aa7233770836081af19e"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.276158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785547cb47-x77nc" event={"ID":"eb94a64c-1a0c-4a61-bb69-e843b627cf35","Type":"ContainerStarted","Data":"2d672d5afe033cb8d13e4990cc214b86467df1b5080405aeaab34bdda430f497"} Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.276187 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.330799 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-785547cb47-x77nc" podStartSLOduration=7.330772227 podStartE2EDuration="7.330772227s" podCreationTimestamp="2025-10-10 07:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:19.294701394 +0000 UTC m=+1146.364292655" watchObservedRunningTime="2025-10-10 07:10:19.330772227 +0000 UTC m=+1146.400363468" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 
07:10:19.512767 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8cd96d85-wz9jb"] Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.519911 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8cd96d85-wz9jb"] Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.537333 4732 scope.go:117] "RemoveContainer" containerID="c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.608574 4732 scope.go:117] "RemoveContainer" containerID="2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b" Oct 10 07:10:19 crc kubenswrapper[4732]: E1010 07:10:19.609207 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b\": container with ID starting with 2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b not found: ID does not exist" containerID="2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.609231 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b"} err="failed to get container status \"2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b\": rpc error: code = NotFound desc = could not find container \"2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b\": container with ID starting with 2333a025685018511f18b9cb2489519da97ea5e392da0a2d5cc6ecfb429d225b not found: ID does not exist" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.609253 4732 scope.go:117] "RemoveContainer" containerID="c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a" Oct 10 07:10:19 crc kubenswrapper[4732]: E1010 07:10:19.611663 4732 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a\": container with ID starting with c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a not found: ID does not exist" containerID="c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.611725 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a"} err="failed to get container status \"c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a\": rpc error: code = NotFound desc = could not find container \"c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a\": container with ID starting with c851bf7fbb4b2eaaa5f2dead859489019176c0691e843faa92f576af4fe0f00a not found: ID does not exist" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.678023 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" path="/var/lib/kubelet/pods/00875c9d-e02f-4325-8a8a-d354fdf4cd80/volumes" Oct 10 07:10:19 crc kubenswrapper[4732]: I1010 07:10:19.754297 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:20 crc kubenswrapper[4732]: I1010 07:10:20.342160 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e980e794-2380-4c1e-ae4d-9a27fb5338f0","Type":"ContainerStarted","Data":"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0"} Oct 10 07:10:20 crc kubenswrapper[4732]: I1010 07:10:20.346372 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" event={"ID":"39f93b6c-c094-4275-ade1-5b4b2d95143c","Type":"ContainerStarted","Data":"a9f1b48810e30ab0b71246cf0f089634d34c6dc36d182577ed0def84fd0ff702"} Oct 10 07:10:20 crc kubenswrapper[4732]: I1010 
07:10:20.348592 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:20 crc kubenswrapper[4732]: I1010 07:10:20.359328 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5214162b-4dac-4e5d-bddc-19d4a50dc78f","Type":"ContainerStarted","Data":"b4b2ea39531c352e667b45cdfea60f2f3a94af65576365d532b6f8768095c15d"} Oct 10 07:10:20 crc kubenswrapper[4732]: I1010 07:10:20.369534 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" podStartSLOduration=3.369516081 podStartE2EDuration="3.369516081s" podCreationTimestamp="2025-10-10 07:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:20.367426294 +0000 UTC m=+1147.437017565" watchObservedRunningTime="2025-10-10 07:10:20.369516081 +0000 UTC m=+1147.439107322" Oct 10 07:10:20 crc kubenswrapper[4732]: I1010 07:10:20.370019 4732 generic.go:334] "Generic (PLEG): container finished" podID="7abb736c-8131-4268-9d1c-3ecf24023962" containerID="78ed76c2c1f4af26198c1385456af18bf92248dd6b0384fea2893b66333c5509" exitCode=0 Oct 10 07:10:20 crc kubenswrapper[4732]: I1010 07:10:20.370853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerDied","Data":"78ed76c2c1f4af26198c1385456af18bf92248dd6b0384fea2893b66333c5509"} Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.063188 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.244981 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.303241 4732 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5dff44767b-nqf2r"] Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.303476 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5dff44767b-nqf2r" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api-log" containerID="cri-o://5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d" gracePeriod=30 Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.303609 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5dff44767b-nqf2r" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api" containerID="cri-o://101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42" gracePeriod=30 Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.380242 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5214162b-4dac-4e5d-bddc-19d4a50dc78f","Type":"ContainerStarted","Data":"5ef152d9f705dbe7de3f5ace6ea499bcb61ba603bd1630cc2c6c22546722c73b"} Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.384079 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api-log" containerID="cri-o://b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0" gracePeriod=30 Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.384307 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e980e794-2380-4c1e-ae4d-9a27fb5338f0","Type":"ContainerStarted","Data":"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7"} Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.384897 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.384947 
4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api" containerID="cri-o://128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7" gracePeriod=30 Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.437244 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.556733715 podStartE2EDuration="5.437226706s" podCreationTimestamp="2025-10-10 07:10:16 +0000 UTC" firstStartedPulling="2025-10-10 07:10:18.21088359 +0000 UTC m=+1145.280474831" lastFinishedPulling="2025-10-10 07:10:19.091376581 +0000 UTC m=+1146.160967822" observedRunningTime="2025-10-10 07:10:21.432293661 +0000 UTC m=+1148.501884912" watchObservedRunningTime="2025-10-10 07:10:21.437226706 +0000 UTC m=+1148.506817947" Oct 10 07:10:21 crc kubenswrapper[4732]: I1010 07:10:21.454978 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.454955649 podStartE2EDuration="4.454955649s" podCreationTimestamp="2025-10-10 07:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:21.453120759 +0000 UTC m=+1148.522712020" watchObservedRunningTime="2025-10-10 07:10:21.454955649 +0000 UTC m=+1148.524546910" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.010605 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.137008 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-combined-ca-bundle\") pod \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.137108 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-scripts\") pod \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.137155 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e980e794-2380-4c1e-ae4d-9a27fb5338f0-etc-machine-id\") pod \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.137208 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxcbj\" (UniqueName: \"kubernetes.io/projected/e980e794-2380-4c1e-ae4d-9a27fb5338f0-kube-api-access-lxcbj\") pod \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.137277 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e980e794-2380-4c1e-ae4d-9a27fb5338f0-logs\") pod \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.137313 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data\") pod \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.137369 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data-custom\") pod \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\" (UID: \"e980e794-2380-4c1e-ae4d-9a27fb5338f0\") " Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.140173 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e980e794-2380-4c1e-ae4d-9a27fb5338f0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e980e794-2380-4c1e-ae4d-9a27fb5338f0" (UID: "e980e794-2380-4c1e-ae4d-9a27fb5338f0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.142902 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e980e794-2380-4c1e-ae4d-9a27fb5338f0-logs" (OuterVolumeSpecName: "logs") pod "e980e794-2380-4c1e-ae4d-9a27fb5338f0" (UID: "e980e794-2380-4c1e-ae4d-9a27fb5338f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.143944 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e980e794-2380-4c1e-ae4d-9a27fb5338f0" (UID: "e980e794-2380-4c1e-ae4d-9a27fb5338f0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.143992 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-scripts" (OuterVolumeSpecName: "scripts") pod "e980e794-2380-4c1e-ae4d-9a27fb5338f0" (UID: "e980e794-2380-4c1e-ae4d-9a27fb5338f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.144180 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e980e794-2380-4c1e-ae4d-9a27fb5338f0-kube-api-access-lxcbj" (OuterVolumeSpecName: "kube-api-access-lxcbj") pod "e980e794-2380-4c1e-ae4d-9a27fb5338f0" (UID: "e980e794-2380-4c1e-ae4d-9a27fb5338f0"). InnerVolumeSpecName "kube-api-access-lxcbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.171882 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e980e794-2380-4c1e-ae4d-9a27fb5338f0" (UID: "e980e794-2380-4c1e-ae4d-9a27fb5338f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.199366 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data" (OuterVolumeSpecName: "config-data") pod "e980e794-2380-4c1e-ae4d-9a27fb5338f0" (UID: "e980e794-2380-4c1e-ae4d-9a27fb5338f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.240111 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.240151 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.240164 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.240175 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e980e794-2380-4c1e-ae4d-9a27fb5338f0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.240189 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxcbj\" (UniqueName: \"kubernetes.io/projected/e980e794-2380-4c1e-ae4d-9a27fb5338f0-kube-api-access-lxcbj\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.240203 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e980e794-2380-4c1e-ae4d-9a27fb5338f0-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.240213 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e980e794-2380-4c1e-ae4d-9a27fb5338f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.247843 4732 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.401434 4732 generic.go:334] "Generic (PLEG): container finished" podID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerID="128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7" exitCode=0 Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.401467 4732 generic.go:334] "Generic (PLEG): container finished" podID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerID="b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0" exitCode=143 Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.401527 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e980e794-2380-4c1e-ae4d-9a27fb5338f0","Type":"ContainerDied","Data":"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7"} Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.401581 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e980e794-2380-4c1e-ae4d-9a27fb5338f0","Type":"ContainerDied","Data":"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0"} Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.401594 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e980e794-2380-4c1e-ae4d-9a27fb5338f0","Type":"ContainerDied","Data":"06c2dd81779f330feb3e75519f14c0a524bf08695080fc82ebd237c11703dce8"} Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.401608 4732 scope.go:117] "RemoveContainer" containerID="128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.401765 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.406542 4732 generic.go:334] "Generic (PLEG): container finished" podID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerID="5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d" exitCode=143 Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.406855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff44767b-nqf2r" event={"ID":"9deab2aa-5aca-4273-97de-da95bd0da4ab","Type":"ContainerDied","Data":"5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d"} Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.432066 4732 scope.go:117] "RemoveContainer" containerID="b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.495364 4732 scope.go:117] "RemoveContainer" containerID="128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7" Oct 10 07:10:22 crc kubenswrapper[4732]: E1010 07:10:22.495912 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7\": container with ID starting with 128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7 not found: ID does not exist" containerID="128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.495953 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7"} err="failed to get container status \"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7\": rpc error: code = NotFound desc = could not find container \"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7\": container with ID starting with 
128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7 not found: ID does not exist" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.495990 4732 scope.go:117] "RemoveContainer" containerID="b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0" Oct 10 07:10:22 crc kubenswrapper[4732]: E1010 07:10:22.496290 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0\": container with ID starting with b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0 not found: ID does not exist" containerID="b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.496313 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0"} err="failed to get container status \"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0\": rpc error: code = NotFound desc = could not find container \"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0\": container with ID starting with b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0 not found: ID does not exist" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.496327 4732 scope.go:117] "RemoveContainer" containerID="128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.496545 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7"} err="failed to get container status \"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7\": rpc error: code = NotFound desc = could not find container \"128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7\": container with ID 
starting with 128c70e4f56b6166e459fd6db41938e2853fc5c33186d6d1c6aa86b9c34c2bf7 not found: ID does not exist" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.496567 4732 scope.go:117] "RemoveContainer" containerID="b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.496873 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0"} err="failed to get container status \"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0\": rpc error: code = NotFound desc = could not find container \"b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0\": container with ID starting with b7adb9f67b3d065782af562532e048f12d511a441d0d76625e761ce4179290a0 not found: ID does not exist" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.497576 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.507920 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.523411 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:22 crc kubenswrapper[4732]: E1010 07:10:22.523922 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerName="dnsmasq-dns" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.523948 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerName="dnsmasq-dns" Oct 10 07:10:22 crc kubenswrapper[4732]: E1010 07:10:22.523969 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api-log" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.523977 4732 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api-log" Oct 10 07:10:22 crc kubenswrapper[4732]: E1010 07:10:22.524017 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerName="init" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.524025 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerName="init" Oct 10 07:10:22 crc kubenswrapper[4732]: E1010 07:10:22.524036 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.524043 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.524248 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api-log" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.524266 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="00875c9d-e02f-4325-8a8a-d354fdf4cd80" containerName="dnsmasq-dns" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.524288 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" containerName="cinder-api" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.525815 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.529178 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.529435 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.529646 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.537128 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.655992 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb179b69-8c25-49b1-88b5-6c17953ffbcd-logs\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656206 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4nq\" (UniqueName: \"kubernetes.io/projected/cb179b69-8c25-49b1-88b5-6c17953ffbcd-kube-api-access-ws4nq\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656278 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-scripts\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656304 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656420 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656558 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb179b69-8c25-49b1-88b5-6c17953ffbcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656679 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.656709 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.758724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4nq\" (UniqueName: \"kubernetes.io/projected/cb179b69-8c25-49b1-88b5-6c17953ffbcd-kube-api-access-ws4nq\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.758820 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-scripts\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.758849 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.758909 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.758957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb179b69-8c25-49b1-88b5-6c17953ffbcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.758988 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.759005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.759020 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.759048 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb179b69-8c25-49b1-88b5-6c17953ffbcd-logs\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.759187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb179b69-8c25-49b1-88b5-6c17953ffbcd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.759545 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb179b69-8c25-49b1-88b5-6c17953ffbcd-logs\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: 
I1010 07:10:22.764132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.764996 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.765345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-scripts\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.765485 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.768390 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.778615 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.786212 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4nq\" (UniqueName: \"kubernetes.io/projected/cb179b69-8c25-49b1-88b5-6c17953ffbcd-kube-api-access-ws4nq\") pod \"cinder-api-0\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " pod="openstack/cinder-api-0" Oct 10 07:10:22 crc kubenswrapper[4732]: I1010 07:10:22.853240 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:10:23 crc kubenswrapper[4732]: I1010 07:10:23.305724 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:10:23 crc kubenswrapper[4732]: W1010 07:10:23.309004 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb179b69_8c25_49b1_88b5_6c17953ffbcd.slice/crio-552f92891e8da3df71277c717d79fbdb0d0cb1e1b3f98cc5f356bc509ce33025 WatchSource:0}: Error finding container 552f92891e8da3df71277c717d79fbdb0d0cb1e1b3f98cc5f356bc509ce33025: Status 404 returned error can't find the container with id 552f92891e8da3df71277c717d79fbdb0d0cb1e1b3f98cc5f356bc509ce33025 Oct 10 07:10:23 crc kubenswrapper[4732]: I1010 07:10:23.428321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb179b69-8c25-49b1-88b5-6c17953ffbcd","Type":"ContainerStarted","Data":"552f92891e8da3df71277c717d79fbdb0d0cb1e1b3f98cc5f356bc509ce33025"} Oct 10 07:10:23 crc kubenswrapper[4732]: I1010 07:10:23.676972 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e980e794-2380-4c1e-ae4d-9a27fb5338f0" path="/var/lib/kubelet/pods/e980e794-2380-4c1e-ae4d-9a27fb5338f0/volumes" Oct 10 07:10:24 crc kubenswrapper[4732]: I1010 07:10:24.441378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"cb179b69-8c25-49b1-88b5-6c17953ffbcd","Type":"ContainerStarted","Data":"f4d100a6f0ecddf29e4d78c67e5019b69856d21ffbb0b74a4b0262657c6c0304"} Oct 10 07:10:24 crc kubenswrapper[4732]: I1010 07:10:24.441952 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 07:10:24 crc kubenswrapper[4732]: I1010 07:10:24.441967 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb179b69-8c25-49b1-88b5-6c17953ffbcd","Type":"ContainerStarted","Data":"10d139c980b60b956965d5489b41546b308f38d9be28a63508a7305b312c69c4"} Oct 10 07:10:24 crc kubenswrapper[4732]: I1010 07:10:24.465494 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.465470182 podStartE2EDuration="2.465470182s" podCreationTimestamp="2025-10-10 07:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:24.462006607 +0000 UTC m=+1151.531597868" watchObservedRunningTime="2025-10-10 07:10:24.465470182 +0000 UTC m=+1151.535061433" Oct 10 07:10:24 crc kubenswrapper[4732]: I1010 07:10:24.884122 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.001316 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg7pv\" (UniqueName: \"kubernetes.io/projected/9deab2aa-5aca-4273-97de-da95bd0da4ab-kube-api-access-pg7pv\") pod \"9deab2aa-5aca-4273-97de-da95bd0da4ab\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.001647 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data-custom\") pod \"9deab2aa-5aca-4273-97de-da95bd0da4ab\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.001812 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-combined-ca-bundle\") pod \"9deab2aa-5aca-4273-97de-da95bd0da4ab\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.001858 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9deab2aa-5aca-4273-97de-da95bd0da4ab-logs\") pod \"9deab2aa-5aca-4273-97de-da95bd0da4ab\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.001905 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data\") pod \"9deab2aa-5aca-4273-97de-da95bd0da4ab\" (UID: \"9deab2aa-5aca-4273-97de-da95bd0da4ab\") " Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.002416 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9deab2aa-5aca-4273-97de-da95bd0da4ab-logs" (OuterVolumeSpecName: "logs") pod "9deab2aa-5aca-4273-97de-da95bd0da4ab" (UID: "9deab2aa-5aca-4273-97de-da95bd0da4ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.009103 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9deab2aa-5aca-4273-97de-da95bd0da4ab-kube-api-access-pg7pv" (OuterVolumeSpecName: "kube-api-access-pg7pv") pod "9deab2aa-5aca-4273-97de-da95bd0da4ab" (UID: "9deab2aa-5aca-4273-97de-da95bd0da4ab"). InnerVolumeSpecName "kube-api-access-pg7pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.013255 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9deab2aa-5aca-4273-97de-da95bd0da4ab" (UID: "9deab2aa-5aca-4273-97de-da95bd0da4ab"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.034944 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9deab2aa-5aca-4273-97de-da95bd0da4ab" (UID: "9deab2aa-5aca-4273-97de-da95bd0da4ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.062464 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data" (OuterVolumeSpecName: "config-data") pod "9deab2aa-5aca-4273-97de-da95bd0da4ab" (UID: "9deab2aa-5aca-4273-97de-da95bd0da4ab"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.103464 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg7pv\" (UniqueName: \"kubernetes.io/projected/9deab2aa-5aca-4273-97de-da95bd0da4ab-kube-api-access-pg7pv\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.103503 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.103515 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.103527 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9deab2aa-5aca-4273-97de-da95bd0da4ab-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.103541 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9deab2aa-5aca-4273-97de-da95bd0da4ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.455502 4732 generic.go:334] "Generic (PLEG): container finished" podID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerID="101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42" exitCode=0 Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.455573 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5dff44767b-nqf2r" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.455609 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff44767b-nqf2r" event={"ID":"9deab2aa-5aca-4273-97de-da95bd0da4ab","Type":"ContainerDied","Data":"101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42"} Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.455660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5dff44767b-nqf2r" event={"ID":"9deab2aa-5aca-4273-97de-da95bd0da4ab","Type":"ContainerDied","Data":"171e52f14aba76308be98bfafe3975f022f1da4efba729fb02b9155106d02994"} Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.455682 4732 scope.go:117] "RemoveContainer" containerID="101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.505444 4732 scope.go:117] "RemoveContainer" containerID="5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.508565 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5dff44767b-nqf2r"] Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.527299 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5dff44767b-nqf2r"] Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.560650 4732 scope.go:117] "RemoveContainer" containerID="101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42" Oct 10 07:10:25 crc kubenswrapper[4732]: E1010 07:10:25.561044 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42\": container with ID starting with 101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42 not found: ID does not exist" 
containerID="101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.561099 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42"} err="failed to get container status \"101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42\": rpc error: code = NotFound desc = could not find container \"101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42\": container with ID starting with 101dc77e6c55a326c31847c5822f78cc5fc848f33c3b678ae6880d6428980b42 not found: ID does not exist" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.561129 4732 scope.go:117] "RemoveContainer" containerID="5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d" Oct 10 07:10:25 crc kubenswrapper[4732]: E1010 07:10:25.561461 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d\": container with ID starting with 5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d not found: ID does not exist" containerID="5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.561495 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d"} err="failed to get container status \"5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d\": rpc error: code = NotFound desc = could not find container \"5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d\": container with ID starting with 5a463523028662a3c1327e97d8b3b099c103d7076d12ac789ce0334ef1f2752d not found: ID does not exist" Oct 10 07:10:25 crc kubenswrapper[4732]: I1010 07:10:25.673700 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" path="/var/lib/kubelet/pods/9deab2aa-5aca-4273-97de-da95bd0da4ab/volumes" Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.319861 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.396203 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d94456597-gz79x"] Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.396430 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d94456597-gz79x" podUID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerName="dnsmasq-dns" containerID="cri-o://9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb" gracePeriod=10 Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.493862 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.535296 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.914409 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.961055 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-nb\") pod \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.961179 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5b7j\" (UniqueName: \"kubernetes.io/projected/f5d0e89e-6480-4747-a693-9f3cac1fb87d-kube-api-access-s5b7j\") pod \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.961214 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-svc\") pod \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.961278 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-sb\") pod \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.961357 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-config\") pod \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.961380 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-swift-storage-0\") pod \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\" (UID: \"f5d0e89e-6480-4747-a693-9f3cac1fb87d\") " Oct 10 07:10:27 crc kubenswrapper[4732]: I1010 07:10:27.969271 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d0e89e-6480-4747-a693-9f3cac1fb87d-kube-api-access-s5b7j" (OuterVolumeSpecName: "kube-api-access-s5b7j") pod "f5d0e89e-6480-4747-a693-9f3cac1fb87d" (UID: "f5d0e89e-6480-4747-a693-9f3cac1fb87d"). InnerVolumeSpecName "kube-api-access-s5b7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.017487 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5d0e89e-6480-4747-a693-9f3cac1fb87d" (UID: "f5d0e89e-6480-4747-a693-9f3cac1fb87d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.018787 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f5d0e89e-6480-4747-a693-9f3cac1fb87d" (UID: "f5d0e89e-6480-4747-a693-9f3cac1fb87d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.025508 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5d0e89e-6480-4747-a693-9f3cac1fb87d" (UID: "f5d0e89e-6480-4747-a693-9f3cac1fb87d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.026207 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5d0e89e-6480-4747-a693-9f3cac1fb87d" (UID: "f5d0e89e-6480-4747-a693-9f3cac1fb87d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.033765 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-config" (OuterVolumeSpecName: "config") pod "f5d0e89e-6480-4747-a693-9f3cac1fb87d" (UID: "f5d0e89e-6480-4747-a693-9f3cac1fb87d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.062756 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.062785 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.062794 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5b7j\" (UniqueName: \"kubernetes.io/projected/f5d0e89e-6480-4747-a693-9f3cac1fb87d-kube-api-access-s5b7j\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.062805 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-dns-svc\") on node \"crc\" DevicePath \"\"" 
Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.062814 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.062821 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0e89e-6480-4747-a693-9f3cac1fb87d-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.503543 4732 generic.go:334] "Generic (PLEG): container finished" podID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerID="9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb" exitCode=0 Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.503960 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="cinder-scheduler" containerID="cri-o://b4b2ea39531c352e667b45cdfea60f2f3a94af65576365d532b6f8768095c15d" gracePeriod=30 Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.503740 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94456597-gz79x" event={"ID":"f5d0e89e-6480-4747-a693-9f3cac1fb87d","Type":"ContainerDied","Data":"9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb"} Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.504210 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d94456597-gz79x" event={"ID":"f5d0e89e-6480-4747-a693-9f3cac1fb87d","Type":"ContainerDied","Data":"57832ce2ecacfeff47b3189dd47a53478f22570c9b085158d7000f2f99316d35"} Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.504231 4732 scope.go:117] "RemoveContainer" containerID="9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.504495 
4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="probe" containerID="cri-o://5ef152d9f705dbe7de3f5ace6ea499bcb61ba603bd1630cc2c6c22546722c73b" gracePeriod=30 Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.504843 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d94456597-gz79x" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.558051 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d94456597-gz79x"] Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.562082 4732 scope.go:117] "RemoveContainer" containerID="56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.565202 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d94456597-gz79x"] Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.581106 4732 scope.go:117] "RemoveContainer" containerID="9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb" Oct 10 07:10:28 crc kubenswrapper[4732]: E1010 07:10:28.581498 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb\": container with ID starting with 9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb not found: ID does not exist" containerID="9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.581544 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb"} err="failed to get container status \"9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb\": rpc error: code = NotFound desc = could 
not find container \"9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb\": container with ID starting with 9bcb118e0955e297ec61bd028538c912fb8bb546ece54862b35891c6316f01cb not found: ID does not exist" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.581575 4732 scope.go:117] "RemoveContainer" containerID="56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63" Oct 10 07:10:28 crc kubenswrapper[4732]: E1010 07:10:28.582200 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63\": container with ID starting with 56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63 not found: ID does not exist" containerID="56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63" Oct 10 07:10:28 crc kubenswrapper[4732]: I1010 07:10:28.582233 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63"} err="failed to get container status \"56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63\": rpc error: code = NotFound desc = could not find container \"56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63\": container with ID starting with 56dc9836340f700cacc727b4818978dbd46f709307347ab73439968fb52bfa63 not found: ID does not exist" Oct 10 07:10:29 crc kubenswrapper[4732]: E1010 07:10:29.496893 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5214162b_4dac_4e5d_bddc_19d4a50dc78f.slice/crio-conmon-5ef152d9f705dbe7de3f5ace6ea499bcb61ba603bd1630cc2c6c22546722c73b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5214162b_4dac_4e5d_bddc_19d4a50dc78f.slice/crio-5ef152d9f705dbe7de3f5ace6ea499bcb61ba603bd1630cc2c6c22546722c73b.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:10:29 crc kubenswrapper[4732]: I1010 07:10:29.516643 4732 generic.go:334] "Generic (PLEG): container finished" podID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerID="5ef152d9f705dbe7de3f5ace6ea499bcb61ba603bd1630cc2c6c22546722c73b" exitCode=0 Oct 10 07:10:29 crc kubenswrapper[4732]: I1010 07:10:29.516725 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5214162b-4dac-4e5d-bddc-19d4a50dc78f","Type":"ContainerDied","Data":"5ef152d9f705dbe7de3f5ace6ea499bcb61ba603bd1630cc2c6c22546722c73b"} Oct 10 07:10:29 crc kubenswrapper[4732]: I1010 07:10:29.671335 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" path="/var/lib/kubelet/pods/f5d0e89e-6480-4747-a693-9f3cac1fb87d/volumes" Oct 10 07:10:30 crc kubenswrapper[4732]: I1010 07:10:30.027929 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.541643 4732 generic.go:334] "Generic (PLEG): container finished" podID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerID="b4b2ea39531c352e667b45cdfea60f2f3a94af65576365d532b6f8768095c15d" exitCode=0 Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.541837 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5214162b-4dac-4e5d-bddc-19d4a50dc78f","Type":"ContainerDied","Data":"b4b2ea39531c352e667b45cdfea60f2f3a94af65576365d532b6f8768095c15d"} Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.645420 4732 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.756334 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-combined-ca-bundle\") pod \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.756411 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65lhf\" (UniqueName: \"kubernetes.io/projected/5214162b-4dac-4e5d-bddc-19d4a50dc78f-kube-api-access-65lhf\") pod \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.756440 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data\") pod \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.756514 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data-custom\") pod \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.756569 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5214162b-4dac-4e5d-bddc-19d4a50dc78f-etc-machine-id\") pod \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.756681 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-scripts\") pod \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\" (UID: \"5214162b-4dac-4e5d-bddc-19d4a50dc78f\") " Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.757070 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5214162b-4dac-4e5d-bddc-19d4a50dc78f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5214162b-4dac-4e5d-bddc-19d4a50dc78f" (UID: "5214162b-4dac-4e5d-bddc-19d4a50dc78f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.764947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-scripts" (OuterVolumeSpecName: "scripts") pod "5214162b-4dac-4e5d-bddc-19d4a50dc78f" (UID: "5214162b-4dac-4e5d-bddc-19d4a50dc78f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.776287 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5214162b-4dac-4e5d-bddc-19d4a50dc78f" (UID: "5214162b-4dac-4e5d-bddc-19d4a50dc78f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.776677 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5214162b-4dac-4e5d-bddc-19d4a50dc78f-kube-api-access-65lhf" (OuterVolumeSpecName: "kube-api-access-65lhf") pod "5214162b-4dac-4e5d-bddc-19d4a50dc78f" (UID: "5214162b-4dac-4e5d-bddc-19d4a50dc78f"). InnerVolumeSpecName "kube-api-access-65lhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.821782 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5214162b-4dac-4e5d-bddc-19d4a50dc78f" (UID: "5214162b-4dac-4e5d-bddc-19d4a50dc78f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.860058 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.860252 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.860275 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65lhf\" (UniqueName: \"kubernetes.io/projected/5214162b-4dac-4e5d-bddc-19d4a50dc78f-kube-api-access-65lhf\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.860310 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.860321 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5214162b-4dac-4e5d-bddc-19d4a50dc78f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.870865 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data" (OuterVolumeSpecName: "config-data") pod "5214162b-4dac-4e5d-bddc-19d4a50dc78f" (UID: "5214162b-4dac-4e5d-bddc-19d4a50dc78f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:31 crc kubenswrapper[4732]: I1010 07:10:31.961577 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5214162b-4dac-4e5d-bddc-19d4a50dc78f-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.557992 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"5214162b-4dac-4e5d-bddc-19d4a50dc78f","Type":"ContainerDied","Data":"c7d182ac1b12f5700c3e08cf27671437cf2f305fd440444aec1460593a3b0d3f"} Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.558035 4732 scope.go:117] "RemoveContainer" containerID="5ef152d9f705dbe7de3f5ace6ea499bcb61ba603bd1630cc2c6c22546722c73b" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.558111 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.589921 4732 scope.go:117] "RemoveContainer" containerID="b4b2ea39531c352e667b45cdfea60f2f3a94af65576365d532b6f8768095c15d" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.605410 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.613062 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637032 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:32 crc kubenswrapper[4732]: E1010 07:10:32.637369 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="probe" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637382 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="probe" Oct 10 07:10:32 crc kubenswrapper[4732]: E1010 07:10:32.637394 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerName="dnsmasq-dns" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637400 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerName="dnsmasq-dns" Oct 10 07:10:32 crc kubenswrapper[4732]: E1010 07:10:32.637425 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api-log" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637432 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api-log" Oct 10 07:10:32 crc kubenswrapper[4732]: E1010 07:10:32.637442 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="cinder-scheduler" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637450 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="cinder-scheduler" Oct 10 07:10:32 crc kubenswrapper[4732]: E1010 07:10:32.637472 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerName="init" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637480 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerName="init" Oct 10 07:10:32 crc kubenswrapper[4732]: E1010 07:10:32.637494 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637502 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637655 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d0e89e-6480-4747-a693-9f3cac1fb87d" containerName="dnsmasq-dns" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637675 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api-log" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637692 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9deab2aa-5aca-4273-97de-da95bd0da4ab" containerName="barbican-api" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637720 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="cinder-scheduler" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.637729 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" containerName="probe" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.638573 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.642444 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.648510 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.677487 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-scripts\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.677755 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/27b69405-bc4b-4e39-be49-0a966bc649bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.677901 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.677993 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.678160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gppq\" (UniqueName: \"kubernetes.io/projected/27b69405-bc4b-4e39-be49-0a966bc649bb-kube-api-access-7gppq\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.678481 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.779868 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.780205 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-scripts\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.780306 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/27b69405-bc4b-4e39-be49-0a966bc649bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" 
Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.780420 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.780528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.780632 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/27b69405-bc4b-4e39-be49-0a966bc649bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.780815 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gppq\" (UniqueName: \"kubernetes.io/projected/27b69405-bc4b-4e39-be49-0a966bc649bb-kube-api-access-7gppq\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.784996 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.785151 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-scripts\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.785254 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.785399 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.814325 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gppq\" (UniqueName: \"kubernetes.io/projected/27b69405-bc4b-4e39-be49-0a966bc649bb-kube-api-access-7gppq\") pod \"cinder-scheduler-0\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " pod="openstack/cinder-scheduler-0" Oct 10 07:10:32 crc kubenswrapper[4732]: I1010 07:10:32.968784 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:10:33 crc kubenswrapper[4732]: I1010 07:10:33.392524 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:10:33 crc kubenswrapper[4732]: I1010 07:10:33.566623 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"27b69405-bc4b-4e39-be49-0a966bc649bb","Type":"ContainerStarted","Data":"33f47b720904e70a98f517a0875c1ad11459a2f94b9d524e5f239ba0d5cb06d7"} Oct 10 07:10:33 crc kubenswrapper[4732]: I1010 07:10:33.676282 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5214162b-4dac-4e5d-bddc-19d4a50dc78f" path="/var/lib/kubelet/pods/5214162b-4dac-4e5d-bddc-19d4a50dc78f/volumes" Oct 10 07:10:34 crc kubenswrapper[4732]: I1010 07:10:34.583304 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"27b69405-bc4b-4e39-be49-0a966bc649bb","Type":"ContainerStarted","Data":"698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686"} Oct 10 07:10:34 crc kubenswrapper[4732]: I1010 07:10:34.739949 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:34 crc kubenswrapper[4732]: I1010 07:10:34.806111 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 10 07:10:34 crc kubenswrapper[4732]: I1010 07:10:34.852515 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:10:35 crc kubenswrapper[4732]: I1010 07:10:35.230742 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:10:35 crc kubenswrapper[4732]: I1010 07:10:35.594463 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"27b69405-bc4b-4e39-be49-0a966bc649bb","Type":"ContainerStarted","Data":"335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d"} Oct 10 07:10:35 crc kubenswrapper[4732]: I1010 07:10:35.626262 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.626225389 podStartE2EDuration="3.626225389s" podCreationTimestamp="2025-10-10 07:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:35.614333225 +0000 UTC m=+1162.683924476" watchObservedRunningTime="2025-10-10 07:10:35.626225389 +0000 UTC m=+1162.695816640" Oct 10 07:10:37 crc kubenswrapper[4732]: I1010 07:10:37.969796 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.341937 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76d49dbb9c-8g2mb"] Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.343771 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.351442 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76d49dbb9c-8g2mb"] Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.360228 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.360504 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.360531 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.497228 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsq5c\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-kube-api-access-zsq5c\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.497287 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-log-httpd\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.497335 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-config-data\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: 
I1010 07:10:39.497364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-internal-tls-certs\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.497426 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-public-tls-certs\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.497474 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-run-httpd\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.497498 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-etc-swift\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.497560 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-combined-ca-bundle\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 
07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.599611 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-combined-ca-bundle\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.600759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsq5c\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-kube-api-access-zsq5c\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.600880 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-log-httpd\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.600980 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-config-data\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.601058 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-internal-tls-certs\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 
07:10:39.601182 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-public-tls-certs\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.601312 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-run-httpd\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.601416 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-etc-swift\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.602872 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-run-httpd\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.601331 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-log-httpd\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.607543 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-config-data\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.608039 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-internal-tls-certs\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.608304 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-etc-swift\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.608895 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-public-tls-certs\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.609308 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-combined-ca-bundle\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.619208 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsq5c\" (UniqueName: 
\"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-kube-api-access-zsq5c\") pod \"swift-proxy-76d49dbb9c-8g2mb\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.620661 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:39 crc kubenswrapper[4732]: I1010 07:10:39.672351 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.171686 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.181206 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.181365 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.184417 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.184565 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gzx26" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.185224 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.310856 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76d49dbb9c-8g2mb"] Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.337436 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config-secret\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.337836 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.337873 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9dl4\" (UniqueName: \"kubernetes.io/projected/b764abfb-d26c-46a0-a3b6-55f388fb593b-kube-api-access-m9dl4\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.337909 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.439773 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config-secret\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.439831 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.439865 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9dl4\" (UniqueName: \"kubernetes.io/projected/b764abfb-d26c-46a0-a3b6-55f388fb593b-kube-api-access-m9dl4\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.439897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.440536 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.441063 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: E1010 07:10:40.441306 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-m9dl4 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="b764abfb-d26c-46a0-a3b6-55f388fb593b" Oct 10 07:10:40 crc kubenswrapper[4732]: E1010 07:10:40.443528 4732 projected.go:194] Error preparing data for projected volume kube-api-access-m9dl4 for pod 
openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 10 07:10:40 crc kubenswrapper[4732]: E1010 07:10:40.443609 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b764abfb-d26c-46a0-a3b6-55f388fb593b-kube-api-access-m9dl4 podName:b764abfb-d26c-46a0-a3b6-55f388fb593b nodeName:}" failed. No retries permitted until 2025-10-10 07:10:40.943586346 +0000 UTC m=+1168.013177587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-m9dl4" (UniqueName: "kubernetes.io/projected/b764abfb-d26c-46a0-a3b6-55f388fb593b-kube-api-access-m9dl4") pod "openstackclient" (UID: "b764abfb-d26c-46a0-a3b6-55f388fb593b") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.445818 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config-secret\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.447324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.451729 4732 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.470104 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.471535 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.478220 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.641241 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.642841 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" event={"ID":"7daaf3e5-82f0-45f7-aa22-40be65433320","Type":"ContainerStarted","Data":"f109eab8f1fb8cc71f4902a857174c8040a04484109856debe238368907364e0"} Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.642905 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" event={"ID":"7daaf3e5-82f0-45f7-aa22-40be65433320","Type":"ContainerStarted","Data":"5874141244f3843be5731f1bb9474d7151f98861d4d7fcf7bccd405260d27426"} Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.643816 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config-secret\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.643860 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.643926 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8f8l\" (UniqueName: \"kubernetes.io/projected/c58c2bae-9347-4644-ae19-ff3781571610-kube-api-access-h8f8l\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.644075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.648147 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b764abfb-d26c-46a0-a3b6-55f388fb593b" podUID="c58c2bae-9347-4644-ae19-ff3781571610" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.733131 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.745437 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8f8l\" (UniqueName: \"kubernetes.io/projected/c58c2bae-9347-4644-ae19-ff3781571610-kube-api-access-h8f8l\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.745610 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.745644 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config-secret\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.745675 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.746574 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.752383 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config-secret\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.752393 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.765275 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8f8l\" (UniqueName: \"kubernetes.io/projected/c58c2bae-9347-4644-ae19-ff3781571610-kube-api-access-h8f8l\") pod \"openstackclient\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.806169 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.849498 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config\") pod \"b764abfb-d26c-46a0-a3b6-55f388fb593b\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.849950 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-combined-ca-bundle\") pod \"b764abfb-d26c-46a0-a3b6-55f388fb593b\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.850009 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config-secret\") pod \"b764abfb-d26c-46a0-a3b6-55f388fb593b\" (UID: \"b764abfb-d26c-46a0-a3b6-55f388fb593b\") " Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.850149 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b764abfb-d26c-46a0-a3b6-55f388fb593b" (UID: "b764abfb-d26c-46a0-a3b6-55f388fb593b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.850664 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.850710 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9dl4\" (UniqueName: \"kubernetes.io/projected/b764abfb-d26c-46a0-a3b6-55f388fb593b-kube-api-access-m9dl4\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.860414 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b764abfb-d26c-46a0-a3b6-55f388fb593b" (UID: "b764abfb-d26c-46a0-a3b6-55f388fb593b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.860505 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b764abfb-d26c-46a0-a3b6-55f388fb593b" (UID: "b764abfb-d26c-46a0-a3b6-55f388fb593b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.953253 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:40 crc kubenswrapper[4732]: I1010 07:10:40.953276 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b764abfb-d26c-46a0-a3b6-55f388fb593b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.366059 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 07:10:41 crc kubenswrapper[4732]: W1010 07:10:41.369309 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc58c2bae_9347_4644_ae19_ff3781571610.slice/crio-363c58ba4f6d7d21dc2e658940fb19f35d302cb44e5fd0363cb4bf57f73cfa63 WatchSource:0}: Error finding container 363c58ba4f6d7d21dc2e658940fb19f35d302cb44e5fd0363cb4bf57f73cfa63: Status 404 returned error can't find the container with id 363c58ba4f6d7d21dc2e658940fb19f35d302cb44e5fd0363cb4bf57f73cfa63 Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.651904 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" event={"ID":"7daaf3e5-82f0-45f7-aa22-40be65433320","Type":"ContainerStarted","Data":"cc76fa90e7b162f0b66e824eee5ff268aceed1434ce758c6900d6e7104073f19"} Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.652185 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.652247 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:41 crc 
kubenswrapper[4732]: I1010 07:10:41.653507 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c58c2bae-9347-4644-ae19-ff3781571610","Type":"ContainerStarted","Data":"363c58ba4f6d7d21dc2e658940fb19f35d302cb44e5fd0363cb4bf57f73cfa63"} Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.653548 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.673212 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b764abfb-d26c-46a0-a3b6-55f388fb593b" path="/var/lib/kubelet/pods/b764abfb-d26c-46a0-a3b6-55f388fb593b/volumes" Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.677206 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b764abfb-d26c-46a0-a3b6-55f388fb593b" podUID="c58c2bae-9347-4644-ae19-ff3781571610" Oct 10 07:10:41 crc kubenswrapper[4732]: I1010 07:10:41.678344 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" podStartSLOduration=2.678317925 podStartE2EDuration="2.678317925s" podCreationTimestamp="2025-10-10 07:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:41.670013368 +0000 UTC m=+1168.739604619" watchObservedRunningTime="2025-10-10 07:10:41.678317925 +0000 UTC m=+1168.747909176" Oct 10 07:10:42 crc kubenswrapper[4732]: I1010 07:10:42.380666 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:10:42 crc kubenswrapper[4732]: I1010 07:10:42.454770 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57f66cf84d-qjn76"] Oct 10 07:10:42 crc kubenswrapper[4732]: I1010 07:10:42.455629 4732 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/neutron-57f66cf84d-qjn76" podUID="104e2934-13aa-441b-b330-be153b392e7f" containerName="neutron-httpd" containerID="cri-o://c2bc0bad73a63836f3b5f7e1858e7d043d3e8d44330d1d892ae6678ea35371be" gracePeriod=30 Oct 10 07:10:42 crc kubenswrapper[4732]: I1010 07:10:42.455038 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-57f66cf84d-qjn76" podUID="104e2934-13aa-441b-b330-be153b392e7f" containerName="neutron-api" containerID="cri-o://25c855b0584ceb27b3166ed7bf66900e6657bef7a1203215356f7e860120abe3" gracePeriod=30 Oct 10 07:10:43 crc kubenswrapper[4732]: I1010 07:10:43.196154 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 07:10:43 crc kubenswrapper[4732]: I1010 07:10:43.708285 4732 generic.go:334] "Generic (PLEG): container finished" podID="104e2934-13aa-441b-b330-be153b392e7f" containerID="c2bc0bad73a63836f3b5f7e1858e7d043d3e8d44330d1d892ae6678ea35371be" exitCode=0 Oct 10 07:10:43 crc kubenswrapper[4732]: I1010 07:10:43.708328 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f66cf84d-qjn76" event={"ID":"104e2934-13aa-441b-b330-be153b392e7f","Type":"ContainerDied","Data":"c2bc0bad73a63836f3b5f7e1858e7d043d3e8d44330d1d892ae6678ea35371be"} Oct 10 07:10:46 crc kubenswrapper[4732]: I1010 07:10:46.738544 4732 generic.go:334] "Generic (PLEG): container finished" podID="104e2934-13aa-441b-b330-be153b392e7f" containerID="25c855b0584ceb27b3166ed7bf66900e6657bef7a1203215356f7e860120abe3" exitCode=0 Oct 10 07:10:46 crc kubenswrapper[4732]: I1010 07:10:46.738616 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f66cf84d-qjn76" event={"ID":"104e2934-13aa-441b-b330-be153b392e7f","Type":"ContainerDied","Data":"25c855b0584ceb27b3166ed7bf66900e6657bef7a1203215356f7e860120abe3"} Oct 10 07:10:48 crc kubenswrapper[4732]: I1010 07:10:48.759463 4732 generic.go:334] "Generic 
(PLEG): container finished" podID="7abb736c-8131-4268-9d1c-3ecf24023962" containerID="e774db61a404214da0ddab611495758c937ad3fd9a2202565bf879ea78a21891" exitCode=137 Oct 10 07:10:48 crc kubenswrapper[4732]: I1010 07:10:48.759606 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerDied","Data":"e774db61a404214da0ddab611495758c937ad3fd9a2202565bf879ea78a21891"} Oct 10 07:10:49 crc kubenswrapper[4732]: I1010 07:10:49.679847 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:49 crc kubenswrapper[4732]: I1010 07:10:49.681804 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.228033 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.290188 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.356904 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-combined-ca-bundle\") pod \"7abb736c-8131-4268-9d1c-3ecf24023962\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.356990 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-scripts\") pod \"7abb736c-8131-4268-9d1c-3ecf24023962\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357349 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc6mf\" (UniqueName: \"kubernetes.io/projected/7abb736c-8131-4268-9d1c-3ecf24023962-kube-api-access-bc6mf\") pod \"7abb736c-8131-4268-9d1c-3ecf24023962\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357436 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-sg-core-conf-yaml\") pod \"7abb736c-8131-4268-9d1c-3ecf24023962\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357750 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-httpd-config\") pod \"104e2934-13aa-441b-b330-be153b392e7f\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357812 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-config-data\") pod \"7abb736c-8131-4268-9d1c-3ecf24023962\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357846 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-config\") pod \"104e2934-13aa-441b-b330-be153b392e7f\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357883 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-log-httpd\") pod \"7abb736c-8131-4268-9d1c-3ecf24023962\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357959 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-combined-ca-bundle\") pod \"104e2934-13aa-441b-b330-be153b392e7f\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.357994 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-run-httpd\") pod \"7abb736c-8131-4268-9d1c-3ecf24023962\" (UID: \"7abb736c-8131-4268-9d1c-3ecf24023962\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.358034 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shsxk\" (UniqueName: \"kubernetes.io/projected/104e2934-13aa-441b-b330-be153b392e7f-kube-api-access-shsxk\") pod \"104e2934-13aa-441b-b330-be153b392e7f\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.359423 
4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7abb736c-8131-4268-9d1c-3ecf24023962" (UID: "7abb736c-8131-4268-9d1c-3ecf24023962"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.359469 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7abb736c-8131-4268-9d1c-3ecf24023962" (UID: "7abb736c-8131-4268-9d1c-3ecf24023962"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.362773 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-scripts" (OuterVolumeSpecName: "scripts") pod "7abb736c-8131-4268-9d1c-3ecf24023962" (UID: "7abb736c-8131-4268-9d1c-3ecf24023962"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.363119 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abb736c-8131-4268-9d1c-3ecf24023962-kube-api-access-bc6mf" (OuterVolumeSpecName: "kube-api-access-bc6mf") pod "7abb736c-8131-4268-9d1c-3ecf24023962" (UID: "7abb736c-8131-4268-9d1c-3ecf24023962"). InnerVolumeSpecName "kube-api-access-bc6mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.365022 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "104e2934-13aa-441b-b330-be153b392e7f" (UID: "104e2934-13aa-441b-b330-be153b392e7f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.365991 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/104e2934-13aa-441b-b330-be153b392e7f-kube-api-access-shsxk" (OuterVolumeSpecName: "kube-api-access-shsxk") pod "104e2934-13aa-441b-b330-be153b392e7f" (UID: "104e2934-13aa-441b-b330-be153b392e7f"). InnerVolumeSpecName "kube-api-access-shsxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.391576 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7abb736c-8131-4268-9d1c-3ecf24023962" (UID: "7abb736c-8131-4268-9d1c-3ecf24023962"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.411590 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "104e2934-13aa-441b-b330-be153b392e7f" (UID: "104e2934-13aa-441b-b330-be153b392e7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.421331 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-config" (OuterVolumeSpecName: "config") pod "104e2934-13aa-441b-b330-be153b392e7f" (UID: "104e2934-13aa-441b-b330-be153b392e7f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.433843 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7abb736c-8131-4268-9d1c-3ecf24023962" (UID: "7abb736c-8131-4268-9d1c-3ecf24023962"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.459312 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-ovndb-tls-certs\") pod \"104e2934-13aa-441b-b330-be153b392e7f\" (UID: \"104e2934-13aa-441b-b330-be153b392e7f\") " Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.459976 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460006 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460024 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shsxk\" (UniqueName: 
\"kubernetes.io/projected/104e2934-13aa-441b-b330-be153b392e7f-kube-api-access-shsxk\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460041 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460056 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460073 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc6mf\" (UniqueName: \"kubernetes.io/projected/7abb736c-8131-4268-9d1c-3ecf24023962-kube-api-access-bc6mf\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460089 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460100 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460112 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.460125 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7abb736c-8131-4268-9d1c-3ecf24023962-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc 
kubenswrapper[4732]: I1010 07:10:51.463052 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-config-data" (OuterVolumeSpecName: "config-data") pod "7abb736c-8131-4268-9d1c-3ecf24023962" (UID: "7abb736c-8131-4268-9d1c-3ecf24023962"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.541645 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "104e2934-13aa-441b-b330-be153b392e7f" (UID: "104e2934-13aa-441b-b330-be153b392e7f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.561391 4732 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/104e2934-13aa-441b-b330-be153b392e7f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.561424 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7abb736c-8131-4268-9d1c-3ecf24023962-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.791810 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-57f66cf84d-qjn76" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.791803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57f66cf84d-qjn76" event={"ID":"104e2934-13aa-441b-b330-be153b392e7f","Type":"ContainerDied","Data":"46e8c2a21b467b2daedc279a5da3e9a3e039267f3c1b603e92eea776b9e41d3f"} Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.792157 4732 scope.go:117] "RemoveContainer" containerID="c2bc0bad73a63836f3b5f7e1858e7d043d3e8d44330d1d892ae6678ea35371be" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.794220 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c58c2bae-9347-4644-ae19-ff3781571610","Type":"ContainerStarted","Data":"1df0bfc677e23cf3848c6b955400cf7ad114080934e2ff619ab58f33fe07c595"} Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.797321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7abb736c-8131-4268-9d1c-3ecf24023962","Type":"ContainerDied","Data":"93be9b4e14917a6bae6a6b28962440e97b7b6a4188a00f9f3f61566c81fe912f"} Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.797482 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.824517 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-57f66cf84d-qjn76"] Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.826399 4732 scope.go:117] "RemoveContainer" containerID="25c855b0584ceb27b3166ed7bf66900e6657bef7a1203215356f7e860120abe3" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.832499 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-57f66cf84d-qjn76"] Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.840390 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.15033652 podStartE2EDuration="11.840373964s" podCreationTimestamp="2025-10-10 07:10:40 +0000 UTC" firstStartedPulling="2025-10-10 07:10:41.371864396 +0000 UTC m=+1168.441455637" lastFinishedPulling="2025-10-10 07:10:51.06190184 +0000 UTC m=+1178.131493081" observedRunningTime="2025-10-10 07:10:51.835880771 +0000 UTC m=+1178.905472022" watchObservedRunningTime="2025-10-10 07:10:51.840373964 +0000 UTC m=+1178.909965205" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.940264 4732 scope.go:117] "RemoveContainer" containerID="e774db61a404214da0ddab611495758c937ad3fd9a2202565bf879ea78a21891" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.955623 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.967181 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.993275 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:51 crc kubenswrapper[4732]: E1010 07:10:51.993773 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" 
containerName="ceilometer-notification-agent" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.993793 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="ceilometer-notification-agent" Oct 10 07:10:51 crc kubenswrapper[4732]: E1010 07:10:51.993810 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="ceilometer-central-agent" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.993817 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="ceilometer-central-agent" Oct 10 07:10:51 crc kubenswrapper[4732]: E1010 07:10:51.993837 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="proxy-httpd" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.993846 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="proxy-httpd" Oct 10 07:10:51 crc kubenswrapper[4732]: E1010 07:10:51.993868 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="sg-core" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.993875 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="sg-core" Oct 10 07:10:51 crc kubenswrapper[4732]: E1010 07:10:51.993887 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104e2934-13aa-441b-b330-be153b392e7f" containerName="neutron-api" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.993895 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="104e2934-13aa-441b-b330-be153b392e7f" containerName="neutron-api" Oct 10 07:10:51 crc kubenswrapper[4732]: E1010 07:10:51.993913 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104e2934-13aa-441b-b330-be153b392e7f" 
containerName="neutron-httpd" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.993920 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="104e2934-13aa-441b-b330-be153b392e7f" containerName="neutron-httpd" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.994129 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="sg-core" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.994150 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="104e2934-13aa-441b-b330-be153b392e7f" containerName="neutron-api" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.994177 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="104e2934-13aa-441b-b330-be153b392e7f" containerName="neutron-httpd" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.994191 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="ceilometer-notification-agent" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.994209 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="ceilometer-central-agent" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.994221 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abb736c-8131-4268-9d1c-3ecf24023962" containerName="proxy-httpd" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.996209 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:51 crc kubenswrapper[4732]: I1010 07:10:51.999832 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.000735 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.004110 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.008721 4732 scope.go:117] "RemoveContainer" containerID="8606c9bfff9cd2a4cba16dfe78c4da2b07535c0b9700f29019aa987ce5e57be7" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.051507 4732 scope.go:117] "RemoveContainer" containerID="78ed76c2c1f4af26198c1385456af18bf92248dd6b0384fea2893b66333c5509" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.070758 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwhw8\" (UniqueName: \"kubernetes.io/projected/bfa52b24-007f-44f2-8bee-613f1446a314-kube-api-access-pwhw8\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.070853 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.071148 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-log-httpd\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " 
pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.071243 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-scripts\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.071380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.071437 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-config-data\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.071589 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-run-httpd\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.075674 4732 scope.go:117] "RemoveContainer" containerID="2179c865f9814250bd4f2d617e85dbe0f1e81cb9fda3265c86a32d30a2a34b79" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.173372 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-log-httpd\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " 
pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.173438 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-scripts\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.173491 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.173527 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-config-data\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.173589 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-run-httpd\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.173628 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwhw8\" (UniqueName: \"kubernetes.io/projected/bfa52b24-007f-44f2-8bee-613f1446a314-kube-api-access-pwhw8\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.174755 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.178222 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-run-httpd\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.178809 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-log-httpd\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.179734 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-config-data\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.181652 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.187337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-scripts\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.187505 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.195047 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwhw8\" (UniqueName: \"kubernetes.io/projected/bfa52b24-007f-44f2-8bee-613f1446a314-kube-api-access-pwhw8\") pod \"ceilometer-0\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.325334 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:52 crc kubenswrapper[4732]: I1010 07:10:52.806907 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:52 crc kubenswrapper[4732]: W1010 07:10:52.820519 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfa52b24_007f_44f2_8bee_613f1446a314.slice/crio-4d00702416a37be92d04b7ce31b62ea1c8ba43f1e34e08e9bb72725b9591d2f5 WatchSource:0}: Error finding container 4d00702416a37be92d04b7ce31b62ea1c8ba43f1e34e08e9bb72725b9591d2f5: Status 404 returned error can't find the container with id 4d00702416a37be92d04b7ce31b62ea1c8ba43f1e34e08e9bb72725b9591d2f5 Oct 10 07:10:53 crc kubenswrapper[4732]: I1010 07:10:53.024194 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:53 crc kubenswrapper[4732]: I1010 07:10:53.697767 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="104e2934-13aa-441b-b330-be153b392e7f" path="/var/lib/kubelet/pods/104e2934-13aa-441b-b330-be153b392e7f/volumes" Oct 10 07:10:53 crc kubenswrapper[4732]: I1010 07:10:53.699152 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7abb736c-8131-4268-9d1c-3ecf24023962" path="/var/lib/kubelet/pods/7abb736c-8131-4268-9d1c-3ecf24023962/volumes" Oct 10 07:10:53 crc kubenswrapper[4732]: I1010 07:10:53.821818 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerStarted","Data":"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563"} Oct 10 07:10:53 crc kubenswrapper[4732]: I1010 07:10:53.821865 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerStarted","Data":"4d00702416a37be92d04b7ce31b62ea1c8ba43f1e34e08e9bb72725b9591d2f5"} Oct 10 07:10:54 crc kubenswrapper[4732]: I1010 07:10:54.838870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerStarted","Data":"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d"} Oct 10 07:10:55 crc kubenswrapper[4732]: I1010 07:10:55.850205 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerStarted","Data":"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03"} Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.891307 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerStarted","Data":"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac"} Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.891895 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-central-agent" containerID="cri-o://596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" gracePeriod=30 Oct 10 07:10:56 crc 
kubenswrapper[4732]: I1010 07:10:56.892182 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.892475 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="proxy-httpd" containerID="cri-o://841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" gracePeriod=30 Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.892541 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="sg-core" containerID="cri-o://1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" gracePeriod=30 Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.892592 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-notification-agent" containerID="cri-o://ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" gracePeriod=30 Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.902852 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pjznz"] Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.904047 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pjznz" Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.921678 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pjznz"] Oct 10 07:10:56 crc kubenswrapper[4732]: I1010 07:10:56.931023 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3872257980000002 podStartE2EDuration="5.931000341s" podCreationTimestamp="2025-10-10 07:10:51 +0000 UTC" firstStartedPulling="2025-10-10 07:10:52.823302225 +0000 UTC m=+1179.892893466" lastFinishedPulling="2025-10-10 07:10:56.367076768 +0000 UTC m=+1183.436668009" observedRunningTime="2025-10-10 07:10:56.915056006 +0000 UTC m=+1183.984647247" watchObservedRunningTime="2025-10-10 07:10:56.931000341 +0000 UTC m=+1184.000591582" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.000763 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5kfkd"] Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.002092 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5kfkd" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.014647 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5kfkd"] Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.066450 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jzv\" (UniqueName: \"kubernetes.io/projected/7f9aff96-840f-4c4c-8ae2-349dd76e614e-kube-api-access-47jzv\") pod \"nova-api-db-create-pjznz\" (UID: \"7f9aff96-840f-4c4c-8ae2-349dd76e614e\") " pod="openstack/nova-api-db-create-pjznz" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.102442 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-94k8g"] Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.103754 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-94k8g" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.114362 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-94k8g"] Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.169644 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psn2s\" (UniqueName: \"kubernetes.io/projected/521796db-f8be-41d3-a251-3ba1101d99bc-kube-api-access-psn2s\") pod \"nova-cell0-db-create-5kfkd\" (UID: \"521796db-f8be-41d3-a251-3ba1101d99bc\") " pod="openstack/nova-cell0-db-create-5kfkd" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.169745 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47jzv\" (UniqueName: \"kubernetes.io/projected/7f9aff96-840f-4c4c-8ae2-349dd76e614e-kube-api-access-47jzv\") pod \"nova-api-db-create-pjznz\" (UID: \"7f9aff96-840f-4c4c-8ae2-349dd76e614e\") " pod="openstack/nova-api-db-create-pjznz" Oct 10 07:10:57 crc 
kubenswrapper[4732]: I1010 07:10:57.191621 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jzv\" (UniqueName: \"kubernetes.io/projected/7f9aff96-840f-4c4c-8ae2-349dd76e614e-kube-api-access-47jzv\") pod \"nova-api-db-create-pjznz\" (UID: \"7f9aff96-840f-4c4c-8ae2-349dd76e614e\") " pod="openstack/nova-api-db-create-pjznz" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.222788 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pjznz" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.272173 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7ml\" (UniqueName: \"kubernetes.io/projected/20f43028-56f9-42d6-ad26-631d79465b65-kube-api-access-sc7ml\") pod \"nova-cell1-db-create-94k8g\" (UID: \"20f43028-56f9-42d6-ad26-631d79465b65\") " pod="openstack/nova-cell1-db-create-94k8g" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.272633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psn2s\" (UniqueName: \"kubernetes.io/projected/521796db-f8be-41d3-a251-3ba1101d99bc-kube-api-access-psn2s\") pod \"nova-cell0-db-create-5kfkd\" (UID: \"521796db-f8be-41d3-a251-3ba1101d99bc\") " pod="openstack/nova-cell0-db-create-5kfkd" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.287858 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psn2s\" (UniqueName: \"kubernetes.io/projected/521796db-f8be-41d3-a251-3ba1101d99bc-kube-api-access-psn2s\") pod \"nova-cell0-db-create-5kfkd\" (UID: \"521796db-f8be-41d3-a251-3ba1101d99bc\") " pod="openstack/nova-cell0-db-create-5kfkd" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.386562 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7ml\" (UniqueName: 
\"kubernetes.io/projected/20f43028-56f9-42d6-ad26-631d79465b65-kube-api-access-sc7ml\") pod \"nova-cell1-db-create-94k8g\" (UID: \"20f43028-56f9-42d6-ad26-631d79465b65\") " pod="openstack/nova-cell1-db-create-94k8g" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.407408 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7ml\" (UniqueName: \"kubernetes.io/projected/20f43028-56f9-42d6-ad26-631d79465b65-kube-api-access-sc7ml\") pod \"nova-cell1-db-create-94k8g\" (UID: \"20f43028-56f9-42d6-ad26-631d79465b65\") " pod="openstack/nova-cell1-db-create-94k8g" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.419058 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5kfkd" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.435483 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-94k8g" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.671724 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pjznz"] Oct 10 07:10:57 crc kubenswrapper[4732]: W1010 07:10:57.672786 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f9aff96_840f_4c4c_8ae2_349dd76e614e.slice/crio-86cab37c7a174ed98fa12a1fb36cb49c8bb6e20bb3c6e70961835ef320afbbc0 WatchSource:0}: Error finding container 86cab37c7a174ed98fa12a1fb36cb49c8bb6e20bb3c6e70961835ef320afbbc0: Status 404 returned error can't find the container with id 86cab37c7a174ed98fa12a1fb36cb49c8bb6e20bb3c6e70961835ef320afbbc0 Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.692435 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.694460 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-sg-core-conf-yaml\") pod \"bfa52b24-007f-44f2-8bee-613f1446a314\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.694527 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwhw8\" (UniqueName: \"kubernetes.io/projected/bfa52b24-007f-44f2-8bee-613f1446a314-kube-api-access-pwhw8\") pod \"bfa52b24-007f-44f2-8bee-613f1446a314\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.694609 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-run-httpd\") pod \"bfa52b24-007f-44f2-8bee-613f1446a314\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.694640 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-log-httpd\") pod \"bfa52b24-007f-44f2-8bee-613f1446a314\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.694664 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-scripts\") pod \"bfa52b24-007f-44f2-8bee-613f1446a314\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.694747 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-combined-ca-bundle\") pod \"bfa52b24-007f-44f2-8bee-613f1446a314\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.694792 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-config-data\") pod \"bfa52b24-007f-44f2-8bee-613f1446a314\" (UID: \"bfa52b24-007f-44f2-8bee-613f1446a314\") " Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.695424 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bfa52b24-007f-44f2-8bee-613f1446a314" (UID: "bfa52b24-007f-44f2-8bee-613f1446a314"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.696126 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bfa52b24-007f-44f2-8bee-613f1446a314" (UID: "bfa52b24-007f-44f2-8bee-613f1446a314"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.704587 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-scripts" (OuterVolumeSpecName: "scripts") pod "bfa52b24-007f-44f2-8bee-613f1446a314" (UID: "bfa52b24-007f-44f2-8bee-613f1446a314"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.724879 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa52b24-007f-44f2-8bee-613f1446a314-kube-api-access-pwhw8" (OuterVolumeSpecName: "kube-api-access-pwhw8") pod "bfa52b24-007f-44f2-8bee-613f1446a314" (UID: "bfa52b24-007f-44f2-8bee-613f1446a314"). InnerVolumeSpecName "kube-api-access-pwhw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.740677 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bfa52b24-007f-44f2-8bee-613f1446a314" (UID: "bfa52b24-007f-44f2-8bee-613f1446a314"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.791363 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfa52b24-007f-44f2-8bee-613f1446a314" (UID: "bfa52b24-007f-44f2-8bee-613f1446a314"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.798504 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.798536 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.798573 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.798582 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwhw8\" (UniqueName: \"kubernetes.io/projected/bfa52b24-007f-44f2-8bee-613f1446a314-kube-api-access-pwhw8\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.798590 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.798599 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bfa52b24-007f-44f2-8bee-613f1446a314-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.846193 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-config-data" (OuterVolumeSpecName: "config-data") pod "bfa52b24-007f-44f2-8bee-613f1446a314" (UID: "bfa52b24-007f-44f2-8bee-613f1446a314"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.901260 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfa52b24-007f-44f2-8bee-613f1446a314-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911052 4732 generic.go:334] "Generic (PLEG): container finished" podID="bfa52b24-007f-44f2-8bee-613f1446a314" containerID="841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" exitCode=0 Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911090 4732 generic.go:334] "Generic (PLEG): container finished" podID="bfa52b24-007f-44f2-8bee-613f1446a314" containerID="1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" exitCode=2 Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911098 4732 generic.go:334] "Generic (PLEG): container finished" podID="bfa52b24-007f-44f2-8bee-613f1446a314" containerID="ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" exitCode=0 Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911105 4732 generic.go:334] "Generic (PLEG): container finished" podID="bfa52b24-007f-44f2-8bee-613f1446a314" containerID="596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" exitCode=0 Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911122 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911198 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerDied","Data":"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac"} Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911291 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerDied","Data":"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03"} Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911303 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerDied","Data":"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d"} Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerDied","Data":"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563"} Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bfa52b24-007f-44f2-8bee-613f1446a314","Type":"ContainerDied","Data":"4d00702416a37be92d04b7ce31b62ea1c8ba43f1e34e08e9bb72725b9591d2f5"} Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.911343 4732 scope.go:117] "RemoveContainer" containerID="841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.916768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pjznz" 
event={"ID":"7f9aff96-840f-4c4c-8ae2-349dd76e614e","Type":"ContainerStarted","Data":"e8717780db17ff634dedc1118d77f3a0be22750ac0a7490e3e813d6089986154"} Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.916808 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pjznz" event={"ID":"7f9aff96-840f-4c4c-8ae2-349dd76e614e","Type":"ContainerStarted","Data":"86cab37c7a174ed98fa12a1fb36cb49c8bb6e20bb3c6e70961835ef320afbbc0"} Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.922141 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5kfkd"] Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.943338 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-pjznz" podStartSLOduration=1.9433210239999998 podStartE2EDuration="1.943321024s" podCreationTimestamp="2025-10-10 07:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:10:57.942481021 +0000 UTC m=+1185.012072272" watchObservedRunningTime="2025-10-10 07:10:57.943321024 +0000 UTC m=+1185.012912265" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.952457 4732 scope.go:117] "RemoveContainer" containerID="1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" Oct 10 07:10:57 crc kubenswrapper[4732]: I1010 07:10:57.974899 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.006191 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.016322 4732 scope.go:117] "RemoveContainer" containerID="ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017100 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] 
Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.017563 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-central-agent" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017584 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-central-agent" Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.017599 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="proxy-httpd" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017607 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="proxy-httpd" Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.017625 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="sg-core" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017633 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="sg-core" Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.017654 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-notification-agent" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017662 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-notification-agent" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017910 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-notification-agent" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017934 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" 
containerName="sg-core" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017956 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="proxy-httpd" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.017971 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" containerName="ceilometer-central-agent" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.020063 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.023111 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.023351 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.048033 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-94k8g"] Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.056914 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.070345 4732 scope.go:117] "RemoveContainer" containerID="596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.090583 4732 scope.go:117] "RemoveContainer" containerID="841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.092230 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": container with ID starting with 841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac not found: ID does not exist" 
containerID="841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.092275 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac"} err="failed to get container status \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": rpc error: code = NotFound desc = could not find container \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": container with ID starting with 841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.092305 4732 scope.go:117] "RemoveContainer" containerID="1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.093047 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": container with ID starting with 1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03 not found: ID does not exist" containerID="1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.093080 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03"} err="failed to get container status \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": rpc error: code = NotFound desc = could not find container \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": container with ID starting with 1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03 not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.093104 4732 scope.go:117] 
"RemoveContainer" containerID="ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.093863 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": container with ID starting with ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d not found: ID does not exist" containerID="ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.093893 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d"} err="failed to get container status \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": rpc error: code = NotFound desc = could not find container \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": container with ID starting with ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.093912 4732 scope.go:117] "RemoveContainer" containerID="596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" Oct 10 07:10:58 crc kubenswrapper[4732]: E1010 07:10:58.094341 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": container with ID starting with 596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563 not found: ID does not exist" containerID="596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.094372 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563"} err="failed to get container status \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": rpc error: code = NotFound desc = could not find container \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": container with ID starting with 596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563 not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.094391 4732 scope.go:117] "RemoveContainer" containerID="841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.094767 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac"} err="failed to get container status \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": rpc error: code = NotFound desc = could not find container \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": container with ID starting with 841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.094793 4732 scope.go:117] "RemoveContainer" containerID="1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.095158 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03"} err="failed to get container status \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": rpc error: code = NotFound desc = could not find container \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": container with ID starting with 1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03 not found: ID does not 
exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.095177 4732 scope.go:117] "RemoveContainer" containerID="ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.095685 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d"} err="failed to get container status \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": rpc error: code = NotFound desc = could not find container \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": container with ID starting with ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.095745 4732 scope.go:117] "RemoveContainer" containerID="596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096043 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563"} err="failed to get container status \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": rpc error: code = NotFound desc = could not find container \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": container with ID starting with 596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563 not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096064 4732 scope.go:117] "RemoveContainer" containerID="841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096295 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac"} err="failed to get container status 
\"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": rpc error: code = NotFound desc = could not find container \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": container with ID starting with 841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096315 4732 scope.go:117] "RemoveContainer" containerID="1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096570 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03"} err="failed to get container status \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": rpc error: code = NotFound desc = could not find container \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": container with ID starting with 1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03 not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096592 4732 scope.go:117] "RemoveContainer" containerID="ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096933 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d"} err="failed to get container status \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": rpc error: code = NotFound desc = could not find container \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": container with ID starting with ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.096953 4732 scope.go:117] "RemoveContainer" 
containerID="596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.097168 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563"} err="failed to get container status \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": rpc error: code = NotFound desc = could not find container \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": container with ID starting with 596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563 not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.097193 4732 scope.go:117] "RemoveContainer" containerID="841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.097421 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac"} err="failed to get container status \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": rpc error: code = NotFound desc = could not find container \"841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac\": container with ID starting with 841a57ade525537f3a89a30e93905edda054bc1b4a882b8547bb6d913e466eac not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.097445 4732 scope.go:117] "RemoveContainer" containerID="1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.097937 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03"} err="failed to get container status \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": rpc error: code = NotFound desc = could 
not find container \"1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03\": container with ID starting with 1ca75c932a686d89e9cd53241345f20dadd37f075116027c6fce3591e2ecee03 not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.098033 4732 scope.go:117] "RemoveContainer" containerID="ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.098433 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d"} err="failed to get container status \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": rpc error: code = NotFound desc = could not find container \"ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d\": container with ID starting with ca539581604622dc1582f6137b4189f30f3a50b203a04273dbde622d9ead047d not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.098457 4732 scope.go:117] "RemoveContainer" containerID="596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.099235 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563"} err="failed to get container status \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": rpc error: code = NotFound desc = could not find container \"596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563\": container with ID starting with 596f26d15b4325a5d1c8c5f7a4250533e51f0d3171cde088236ad7f2dc7f5563 not found: ID does not exist" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.105164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.105334 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-log-httpd\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.105470 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2jll\" (UniqueName: \"kubernetes.io/projected/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-kube-api-access-w2jll\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.105587 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-run-httpd\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.105671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.105797 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-scripts\") pod \"ceilometer-0\" (UID: 
\"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.105881 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-config-data\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.207338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.207395 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-log-httpd\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.207436 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2jll\" (UniqueName: \"kubernetes.io/projected/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-kube-api-access-w2jll\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.207452 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-run-httpd\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.207472 4732 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.207487 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-scripts\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.207507 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-config-data\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.208465 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-log-httpd\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.209262 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-run-httpd\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.212361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 
07:10:58.212982 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.212994 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-config-data\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.213123 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-scripts\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.230825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2jll\" (UniqueName: \"kubernetes.io/projected/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-kube-api-access-w2jll\") pod \"ceilometer-0\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") " pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.377138 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.555734 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.555980 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-log" containerID="cri-o://fa2fe5ffad08eda01609d40a03daf2d04b8e09e8fdd5eedf039c44ae70aee915" gracePeriod=30 Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.556654 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-httpd" containerID="cri-o://3ecbe09d14b061b333b0a3c30f7012202070c01d6d13e5aeab8d8547767af628" gracePeriod=30 Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.880452 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:58 crc kubenswrapper[4732]: W1010 07:10:58.881122 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e62f0a3_8369_4068_ba8f_8d8a2937cd99.slice/crio-ee37e353405cffa380bf9989daf8eac321a5cd820d9018d6bec624aa5fb90de9 WatchSource:0}: Error finding container ee37e353405cffa380bf9989daf8eac321a5cd820d9018d6bec624aa5fb90de9: Status 404 returned error can't find the container with id ee37e353405cffa380bf9989daf8eac321a5cd820d9018d6bec624aa5fb90de9 Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.929568 4732 generic.go:334] "Generic (PLEG): container finished" podID="521796db-f8be-41d3-a251-3ba1101d99bc" containerID="a2fde98389bfd263442de180a19d9928051eed6c67f10e41cd4202a46c1c7e22" exitCode=0 Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.929618 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-db-create-5kfkd" event={"ID":"521796db-f8be-41d3-a251-3ba1101d99bc","Type":"ContainerDied","Data":"a2fde98389bfd263442de180a19d9928051eed6c67f10e41cd4202a46c1c7e22"} Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.929662 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5kfkd" event={"ID":"521796db-f8be-41d3-a251-3ba1101d99bc","Type":"ContainerStarted","Data":"9adb141fc102ce64f5d52c45064cb6774170753c00a8ebaede645ece18e94852"} Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.930864 4732 generic.go:334] "Generic (PLEG): container finished" podID="20f43028-56f9-42d6-ad26-631d79465b65" containerID="1f3d13873970bd134f2543c565b451c32ae0541427673f4a084babf5197a72ef" exitCode=0 Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.930918 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-94k8g" event={"ID":"20f43028-56f9-42d6-ad26-631d79465b65","Type":"ContainerDied","Data":"1f3d13873970bd134f2543c565b451c32ae0541427673f4a084babf5197a72ef"} Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.930934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-94k8g" event={"ID":"20f43028-56f9-42d6-ad26-631d79465b65","Type":"ContainerStarted","Data":"a8239603df9757bd580ebabb52bae02559cd05077d0b84a8728759e1c5e514ac"} Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.933923 4732 generic.go:334] "Generic (PLEG): container finished" podID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerID="fa2fe5ffad08eda01609d40a03daf2d04b8e09e8fdd5eedf039c44ae70aee915" exitCode=143 Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.933972 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9","Type":"ContainerDied","Data":"fa2fe5ffad08eda01609d40a03daf2d04b8e09e8fdd5eedf039c44ae70aee915"} Oct 10 07:10:58 crc 
kubenswrapper[4732]: I1010 07:10:58.935178 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerStarted","Data":"ee37e353405cffa380bf9989daf8eac321a5cd820d9018d6bec624aa5fb90de9"} Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.936425 4732 generic.go:334] "Generic (PLEG): container finished" podID="7f9aff96-840f-4c4c-8ae2-349dd76e614e" containerID="e8717780db17ff634dedc1118d77f3a0be22750ac0a7490e3e813d6089986154" exitCode=0 Oct 10 07:10:58 crc kubenswrapper[4732]: I1010 07:10:58.936462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pjznz" event={"ID":"7f9aff96-840f-4c4c-8ae2-349dd76e614e","Type":"ContainerDied","Data":"e8717780db17ff634dedc1118d77f3a0be22750ac0a7490e3e813d6089986154"} Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.238473 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.238773 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-log" containerID="cri-o://b7464fe56625fd8bdabbebfebe13747ad27ebcc9839bed3c455e68309f1b3a7d" gracePeriod=30 Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.238835 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-httpd" containerID="cri-o://f33cbdea2122f73beec7de26dbafdccd714f4862f702da77f5ed82dd6b03e431" gracePeriod=30 Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.432585 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.674327 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bfa52b24-007f-44f2-8bee-613f1446a314" path="/var/lib/kubelet/pods/bfa52b24-007f-44f2-8bee-613f1446a314/volumes" Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.950474 4732 generic.go:334] "Generic (PLEG): container finished" podID="e300892d-faa1-4ba1-8095-04539bc33e27" containerID="b7464fe56625fd8bdabbebfebe13747ad27ebcc9839bed3c455e68309f1b3a7d" exitCode=143 Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.950573 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e300892d-faa1-4ba1-8095-04539bc33e27","Type":"ContainerDied","Data":"b7464fe56625fd8bdabbebfebe13747ad27ebcc9839bed3c455e68309f1b3a7d"} Oct 10 07:10:59 crc kubenswrapper[4732]: I1010 07:10:59.953809 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerStarted","Data":"96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb"} Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.341732 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pjznz" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.389778 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5kfkd" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.403129 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-94k8g" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.468087 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47jzv\" (UniqueName: \"kubernetes.io/projected/7f9aff96-840f-4c4c-8ae2-349dd76e614e-kube-api-access-47jzv\") pod \"7f9aff96-840f-4c4c-8ae2-349dd76e614e\" (UID: \"7f9aff96-840f-4c4c-8ae2-349dd76e614e\") " Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.473857 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9aff96-840f-4c4c-8ae2-349dd76e614e-kube-api-access-47jzv" (OuterVolumeSpecName: "kube-api-access-47jzv") pod "7f9aff96-840f-4c4c-8ae2-349dd76e614e" (UID: "7f9aff96-840f-4c4c-8ae2-349dd76e614e"). InnerVolumeSpecName "kube-api-access-47jzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.569164 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc7ml\" (UniqueName: \"kubernetes.io/projected/20f43028-56f9-42d6-ad26-631d79465b65-kube-api-access-sc7ml\") pod \"20f43028-56f9-42d6-ad26-631d79465b65\" (UID: \"20f43028-56f9-42d6-ad26-631d79465b65\") " Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.569481 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psn2s\" (UniqueName: \"kubernetes.io/projected/521796db-f8be-41d3-a251-3ba1101d99bc-kube-api-access-psn2s\") pod \"521796db-f8be-41d3-a251-3ba1101d99bc\" (UID: \"521796db-f8be-41d3-a251-3ba1101d99bc\") " Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.570144 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47jzv\" (UniqueName: \"kubernetes.io/projected/7f9aff96-840f-4c4c-8ae2-349dd76e614e-kube-api-access-47jzv\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.572537 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f43028-56f9-42d6-ad26-631d79465b65-kube-api-access-sc7ml" (OuterVolumeSpecName: "kube-api-access-sc7ml") pod "20f43028-56f9-42d6-ad26-631d79465b65" (UID: "20f43028-56f9-42d6-ad26-631d79465b65"). InnerVolumeSpecName "kube-api-access-sc7ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.573149 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/521796db-f8be-41d3-a251-3ba1101d99bc-kube-api-access-psn2s" (OuterVolumeSpecName: "kube-api-access-psn2s") pod "521796db-f8be-41d3-a251-3ba1101d99bc" (UID: "521796db-f8be-41d3-a251-3ba1101d99bc"). InnerVolumeSpecName "kube-api-access-psn2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.672234 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psn2s\" (UniqueName: \"kubernetes.io/projected/521796db-f8be-41d3-a251-3ba1101d99bc-kube-api-access-psn2s\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.672441 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc7ml\" (UniqueName: \"kubernetes.io/projected/20f43028-56f9-42d6-ad26-631d79465b65-kube-api-access-sc7ml\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.973075 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerStarted","Data":"b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961"} Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.977972 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pjznz" 
event={"ID":"7f9aff96-840f-4c4c-8ae2-349dd76e614e","Type":"ContainerDied","Data":"86cab37c7a174ed98fa12a1fb36cb49c8bb6e20bb3c6e70961835ef320afbbc0"} Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.978012 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86cab37c7a174ed98fa12a1fb36cb49c8bb6e20bb3c6e70961835ef320afbbc0" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.978070 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pjznz" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.982090 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5kfkd" event={"ID":"521796db-f8be-41d3-a251-3ba1101d99bc","Type":"ContainerDied","Data":"9adb141fc102ce64f5d52c45064cb6774170753c00a8ebaede645ece18e94852"} Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.982128 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adb141fc102ce64f5d52c45064cb6774170753c00a8ebaede645ece18e94852" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.982182 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5kfkd" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.988902 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-94k8g" event={"ID":"20f43028-56f9-42d6-ad26-631d79465b65","Type":"ContainerDied","Data":"a8239603df9757bd580ebabb52bae02559cd05077d0b84a8728759e1c5e514ac"} Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.988929 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8239603df9757bd580ebabb52bae02559cd05077d0b84a8728759e1c5e514ac" Oct 10 07:11:00 crc kubenswrapper[4732]: I1010 07:11:00.988953 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-94k8g" Oct 10 07:11:02 crc kubenswrapper[4732]: I1010 07:11:02.008902 4732 generic.go:334] "Generic (PLEG): container finished" podID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerID="3ecbe09d14b061b333b0a3c30f7012202070c01d6d13e5aeab8d8547767af628" exitCode=0 Oct 10 07:11:02 crc kubenswrapper[4732]: I1010 07:11:02.008983 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9","Type":"ContainerDied","Data":"3ecbe09d14b061b333b0a3c30f7012202070c01d6d13e5aeab8d8547767af628"} Oct 10 07:11:03 crc kubenswrapper[4732]: I1010 07:11:03.904924 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.040091 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-public-tls-certs\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.040240 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjlfr\" (UniqueName: \"kubernetes.io/projected/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-kube-api-access-gjlfr\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.040269 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-config-data\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.041111 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-logs\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.041141 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-combined-ca-bundle\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.041158 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.041191 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-httpd-run\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.041296 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-scripts\") pod \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\" (UID: \"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.041467 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-logs" (OuterVolumeSpecName: "logs") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.041895 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.046048 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-kube-api-access-gjlfr" (OuterVolumeSpecName: "kube-api-access-gjlfr") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "kube-api-access-gjlfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.048199 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-scripts" (OuterVolumeSpecName: "scripts") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.048859 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.067665 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.069941 4732 generic.go:334] "Generic (PLEG): container finished" podID="e300892d-faa1-4ba1-8095-04539bc33e27" containerID="f33cbdea2122f73beec7de26dbafdccd714f4862f702da77f5ed82dd6b03e431" exitCode=0 Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.069986 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e300892d-faa1-4ba1-8095-04539bc33e27","Type":"ContainerDied","Data":"f33cbdea2122f73beec7de26dbafdccd714f4862f702da77f5ed82dd6b03e431"} Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.070317 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e300892d-faa1-4ba1-8095-04539bc33e27","Type":"ContainerDied","Data":"3369f97e9beba140b2e3d0aec0a446b524437c050aff712b8f90b3872dfc7076"} Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.070336 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3369f97e9beba140b2e3d0aec0a446b524437c050aff712b8f90b3872dfc7076" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.073089 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.073121 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"15fceeba-43c8-485c-a3f9-a4a1d5c74ff9","Type":"ContainerDied","Data":"0423956a5b53d32f84a4c44e34d606b9f504f3b37c897a6b1e3cb6f852f7ad59"} Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.073195 4732 scope.go:117] "RemoveContainer" containerID="3ecbe09d14b061b333b0a3c30f7012202070c01d6d13e5aeab8d8547767af628" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.073640 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.075105 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerStarted","Data":"fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443"} Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.084843 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.099402 4732 scope.go:117] "RemoveContainer" containerID="fa2fe5ffad08eda01609d40a03daf2d04b8e09e8fdd5eedf039c44ae70aee915" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.123560 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.135718 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-config-data" (OuterVolumeSpecName: "config-data") pod "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" (UID: "15fceeba-43c8-485c-a3f9-a4a1d5c74ff9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.143814 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.143847 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.143856 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.143866 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjlfr\" (UniqueName: \"kubernetes.io/projected/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-kube-api-access-gjlfr\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.143876 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.143886 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.143917 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.164183 4732 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.244457 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.244550 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-scripts\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.244655 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-combined-ca-bundle\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.245067 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-logs\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.245298 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-config-data\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.245437 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4cqs\" 
(UniqueName: \"kubernetes.io/projected/e300892d-faa1-4ba1-8095-04539bc33e27-kube-api-access-v4cqs\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.245473 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-httpd-run\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.245530 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-internal-tls-certs\") pod \"e300892d-faa1-4ba1-8095-04539bc33e27\" (UID: \"e300892d-faa1-4ba1-8095-04539bc33e27\") " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.246517 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-logs" (OuterVolumeSpecName: "logs") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.246597 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.246719 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.248294 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.249728 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e300892d-faa1-4ba1-8095-04539bc33e27-kube-api-access-v4cqs" (OuterVolumeSpecName: "kube-api-access-v4cqs") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "kube-api-access-v4cqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.250279 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-scripts" (OuterVolumeSpecName: "scripts") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.287836 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.309851 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.316516 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-config-data" (OuterVolumeSpecName: "config-data") pod "e300892d-faa1-4ba1-8095-04539bc33e27" (UID: "e300892d-faa1-4ba1-8095-04539bc33e27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.349017 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.349050 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.349063 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.349096 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: 
I1010 07:11:04.349107 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4cqs\" (UniqueName: \"kubernetes.io/projected/e300892d-faa1-4ba1-8095-04539bc33e27-kube-api-access-v4cqs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.349116 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e300892d-faa1-4ba1-8095-04539bc33e27-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.349124 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e300892d-faa1-4ba1-8095-04539bc33e27-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.349181 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.375570 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.409807 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.425664 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.433572 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:11:04 crc kubenswrapper[4732]: E1010 07:11:04.434047 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20f43028-56f9-42d6-ad26-631d79465b65" containerName="mariadb-database-create" Oct 10 07:11:04 crc 
kubenswrapper[4732]: I1010 07:11:04.434088 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f43028-56f9-42d6-ad26-631d79465b65" containerName="mariadb-database-create" Oct 10 07:11:04 crc kubenswrapper[4732]: E1010 07:11:04.434105 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-httpd" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434111 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-httpd" Oct 10 07:11:04 crc kubenswrapper[4732]: E1010 07:11:04.434129 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-httpd" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434135 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-httpd" Oct 10 07:11:04 crc kubenswrapper[4732]: E1010 07:11:04.434147 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9aff96-840f-4c4c-8ae2-349dd76e614e" containerName="mariadb-database-create" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434154 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9aff96-840f-4c4c-8ae2-349dd76e614e" containerName="mariadb-database-create" Oct 10 07:11:04 crc kubenswrapper[4732]: E1010 07:11:04.434161 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="521796db-f8be-41d3-a251-3ba1101d99bc" containerName="mariadb-database-create" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434167 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="521796db-f8be-41d3-a251-3ba1101d99bc" containerName="mariadb-database-create" Oct 10 07:11:04 crc kubenswrapper[4732]: E1010 07:11:04.434177 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-log" Oct 10 
07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434184 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-log" Oct 10 07:11:04 crc kubenswrapper[4732]: E1010 07:11:04.434193 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-log" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434199 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-log" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434362 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-httpd" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434376 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-httpd" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434389 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9aff96-840f-4c4c-8ae2-349dd76e614e" containerName="mariadb-database-create" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434398 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" containerName="glance-log" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434407 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" containerName="glance-log" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434418 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20f43028-56f9-42d6-ad26-631d79465b65" containerName="mariadb-database-create" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.434429 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="521796db-f8be-41d3-a251-3ba1101d99bc" containerName="mariadb-database-create" Oct 
10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.435398 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.437798 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.438269 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.450268 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.451240 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559620 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559718 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-logs\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559794 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94gt\" (UniqueName: \"kubernetes.io/projected/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-kube-api-access-c94gt\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559814 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559878 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.559898 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663742 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-logs\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663811 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94gt\" (UniqueName: \"kubernetes.io/projected/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-kube-api-access-c94gt\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663834 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663854 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663889 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663907 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663943 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.663980 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.664396 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-logs\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.664487 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: 
\"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.664704 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.670635 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-config-data\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.675873 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.681667 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.682393 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-scripts\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " 
pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.686984 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94gt\" (UniqueName: \"kubernetes.io/projected/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-kube-api-access-c94gt\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:04 crc kubenswrapper[4732]: I1010 07:11:04.757412 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " pod="openstack/glance-default-external-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.055642 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.093483 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.101132 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerStarted","Data":"6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9"} Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.101299 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.101309 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-central-agent" containerID="cri-o://96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb" gracePeriod=30 Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.101354 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="sg-core" containerID="cri-o://fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443" gracePeriod=30 Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.101412 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="proxy-httpd" containerID="cri-o://6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9" gracePeriod=30 Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.101459 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-notification-agent" containerID="cri-o://b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961" gracePeriod=30 Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.134927 4732 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.163472657 podStartE2EDuration="8.134900248s" podCreationTimestamp="2025-10-10 07:10:57 +0000 UTC" firstStartedPulling="2025-10-10 07:10:58.88535327 +0000 UTC m=+1185.954944511" lastFinishedPulling="2025-10-10 07:11:04.856780861 +0000 UTC m=+1191.926372102" observedRunningTime="2025-10-10 07:11:05.132996266 +0000 UTC m=+1192.202587517" watchObservedRunningTime="2025-10-10 07:11:05.134900248 +0000 UTC m=+1192.204491489" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.183501 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.188039 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.213263 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.214738 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.222294 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.222566 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.233944 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279651 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986sk\" (UniqueName: \"kubernetes.io/projected/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-kube-api-access-986sk\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279742 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279775 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279799 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-logs\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279831 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279900 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279927 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.279942 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.381809 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382262 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382305 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382323 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382354 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986sk\" (UniqueName: \"kubernetes.io/projected/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-kube-api-access-986sk\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382396 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382428 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382449 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-logs\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382793 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.382862 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-logs\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.383098 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") device 
mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.387501 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.389498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.389586 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.397917 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.401684 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986sk\" (UniqueName: \"kubernetes.io/projected/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-kube-api-access-986sk\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 
10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.423043 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.567209 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.658995 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.679132 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fceeba-43c8-485c-a3f9-a4a1d5c74ff9" path="/var/lib/kubelet/pods/15fceeba-43c8-485c-a3f9-a4a1d5c74ff9/volumes" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.680145 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e300892d-faa1-4ba1-8095-04539bc33e27" path="/var/lib/kubelet/pods/e300892d-faa1-4ba1-8095-04539bc33e27/volumes" Oct 10 07:11:05 crc kubenswrapper[4732]: I1010 07:11:05.969998 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:11:05 crc kubenswrapper[4732]: W1010 07:11:05.979116 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda570b39e_7329_4bba_bfe0_cf5f7aa2269e.slice/crio-b8c9aa31e222078d4192908fa0e5d027440473db942e4d92c412271b5d5647e4 WatchSource:0}: Error finding container b8c9aa31e222078d4192908fa0e5d027440473db942e4d92c412271b5d5647e4: Status 404 returned error can't find the container with id b8c9aa31e222078d4192908fa0e5d027440473db942e4d92c412271b5d5647e4 Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.104132 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a570b39e-7329-4bba-bfe0-cf5f7aa2269e","Type":"ContainerStarted","Data":"b8c9aa31e222078d4192908fa0e5d027440473db942e4d92c412271b5d5647e4"} Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.110860 4732 generic.go:334] "Generic (PLEG): container finished" podID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerID="fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443" exitCode=2 Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.110895 4732 generic.go:334] "Generic (PLEG): container finished" podID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerID="b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961" exitCode=0 Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.110909 4732 generic.go:334] "Generic (PLEG): container finished" podID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerID="96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb" exitCode=0 Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.110965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerDied","Data":"fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443"} Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.110996 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerDied","Data":"b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961"} Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.111012 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerDied","Data":"96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb"} Oct 10 07:11:06 crc kubenswrapper[4732]: I1010 07:11:06.112746 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2","Type":"ContainerStarted","Data":"b74aa03cba2028b41de341da82af3ddd941b4c035fe7204eeb523e579a423b20"} Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.067058 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0b57-account-create-glphj"] Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.070370 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0b57-account-create-glphj" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.073268 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.087950 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0b57-account-create-glphj"] Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.122380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcx5p\" (UniqueName: \"kubernetes.io/projected/6d536f03-ccb0-4f7f-9d6a-8e2250557ecb-kube-api-access-pcx5p\") pod \"nova-api-0b57-account-create-glphj\" (UID: \"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb\") " pod="openstack/nova-api-0b57-account-create-glphj" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.127149 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2","Type":"ContainerStarted","Data":"d76c61903df35f0f8176003951cca022e8081f2049617d8550fff70d06901f35"} Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.127189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2","Type":"ContainerStarted","Data":"b888ac0b6d592ec667eb861df78abe425788617cf262c3af3800b0ec2cf59863"} Oct 10 07:11:07 crc kubenswrapper[4732]: 
I1010 07:11:07.138175 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a570b39e-7329-4bba-bfe0-cf5f7aa2269e","Type":"ContainerStarted","Data":"f366ccf0fd7eff9163283eb01f40b778944fbee5750e2fdcbc35a6bd70d5f9a8"} Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.138214 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a570b39e-7329-4bba-bfe0-cf5f7aa2269e","Type":"ContainerStarted","Data":"58d341eb205d877224a5cb6a46597c2de42b015fa39f99b361d6a4d67ded1cc4"} Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.152048 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.152029588 podStartE2EDuration="3.152029588s" podCreationTimestamp="2025-10-10 07:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:07.147514505 +0000 UTC m=+1194.217105756" watchObservedRunningTime="2025-10-10 07:11:07.152029588 +0000 UTC m=+1194.221620819" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.179680 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.179658872 podStartE2EDuration="2.179658872s" podCreationTimestamp="2025-10-10 07:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:07.175800826 +0000 UTC m=+1194.245392087" watchObservedRunningTime="2025-10-10 07:11:07.179658872 +0000 UTC m=+1194.249250113" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.225981 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcx5p\" (UniqueName: 
\"kubernetes.io/projected/6d536f03-ccb0-4f7f-9d6a-8e2250557ecb-kube-api-access-pcx5p\") pod \"nova-api-0b57-account-create-glphj\" (UID: \"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb\") " pod="openstack/nova-api-0b57-account-create-glphj" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.268950 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcx5p\" (UniqueName: \"kubernetes.io/projected/6d536f03-ccb0-4f7f-9d6a-8e2250557ecb-kube-api-access-pcx5p\") pod \"nova-api-0b57-account-create-glphj\" (UID: \"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb\") " pod="openstack/nova-api-0b57-account-create-glphj" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.273551 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-0878-account-create-hdbsh"] Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.274683 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0878-account-create-hdbsh" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.276862 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.297472 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0878-account-create-hdbsh"] Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.330715 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd944\" (UniqueName: \"kubernetes.io/projected/f2fb6fd2-36fc-4a19-8462-f59d719b09d9-kube-api-access-bd944\") pod \"nova-cell0-0878-account-create-hdbsh\" (UID: \"f2fb6fd2-36fc-4a19-8462-f59d719b09d9\") " pod="openstack/nova-cell0-0878-account-create-hdbsh" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.395587 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0b57-account-create-glphj" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.432537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd944\" (UniqueName: \"kubernetes.io/projected/f2fb6fd2-36fc-4a19-8462-f59d719b09d9-kube-api-access-bd944\") pod \"nova-cell0-0878-account-create-hdbsh\" (UID: \"f2fb6fd2-36fc-4a19-8462-f59d719b09d9\") " pod="openstack/nova-cell0-0878-account-create-hdbsh" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.459879 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd944\" (UniqueName: \"kubernetes.io/projected/f2fb6fd2-36fc-4a19-8462-f59d719b09d9-kube-api-access-bd944\") pod \"nova-cell0-0878-account-create-hdbsh\" (UID: \"f2fb6fd2-36fc-4a19-8462-f59d719b09d9\") " pod="openstack/nova-cell0-0878-account-create-hdbsh" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.461614 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a8d0-account-create-fx6nt"] Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.462887 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a8d0-account-create-fx6nt" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.464747 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.469718 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a8d0-account-create-fx6nt"] Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.534753 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ttm\" (UniqueName: \"kubernetes.io/projected/c85cd845-7899-4892-be21-259881ff6ed5-kube-api-access-d7ttm\") pod \"nova-cell1-a8d0-account-create-fx6nt\" (UID: \"c85cd845-7899-4892-be21-259881ff6ed5\") " pod="openstack/nova-cell1-a8d0-account-create-fx6nt" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.634471 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0878-account-create-hdbsh" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.635741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ttm\" (UniqueName: \"kubernetes.io/projected/c85cd845-7899-4892-be21-259881ff6ed5-kube-api-access-d7ttm\") pod \"nova-cell1-a8d0-account-create-fx6nt\" (UID: \"c85cd845-7899-4892-be21-259881ff6ed5\") " pod="openstack/nova-cell1-a8d0-account-create-fx6nt" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.651581 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ttm\" (UniqueName: \"kubernetes.io/projected/c85cd845-7899-4892-be21-259881ff6ed5-kube-api-access-d7ttm\") pod \"nova-cell1-a8d0-account-create-fx6nt\" (UID: \"c85cd845-7899-4892-be21-259881ff6ed5\") " pod="openstack/nova-cell1-a8d0-account-create-fx6nt" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.813243 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a8d0-account-create-fx6nt" Oct 10 07:11:07 crc kubenswrapper[4732]: I1010 07:11:07.878783 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0b57-account-create-glphj"] Oct 10 07:11:07 crc kubenswrapper[4732]: W1010 07:11:07.881364 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d536f03_ccb0_4f7f_9d6a_8e2250557ecb.slice/crio-1338f64446143ad4cec93b015ae7043fa48566f595225ef293cf958ca319f0d0 WatchSource:0}: Error finding container 1338f64446143ad4cec93b015ae7043fa48566f595225ef293cf958ca319f0d0: Status 404 returned error can't find the container with id 1338f64446143ad4cec93b015ae7043fa48566f595225ef293cf958ca319f0d0 Oct 10 07:11:08 crc kubenswrapper[4732]: I1010 07:11:08.126682 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a8d0-account-create-fx6nt"] Oct 10 07:11:08 crc kubenswrapper[4732]: I1010 07:11:08.156180 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0878-account-create-hdbsh"] Oct 10 07:11:08 crc kubenswrapper[4732]: I1010 07:11:08.156881 4732 generic.go:334] "Generic (PLEG): container finished" podID="6d536f03-ccb0-4f7f-9d6a-8e2250557ecb" containerID="81bc0eabbce19d596497bbee4c0eaceeb22f5b1132be590874c75ea9e4b56d03" exitCode=0 Oct 10 07:11:08 crc kubenswrapper[4732]: I1010 07:11:08.158091 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0b57-account-create-glphj" event={"ID":"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb","Type":"ContainerDied","Data":"81bc0eabbce19d596497bbee4c0eaceeb22f5b1132be590874c75ea9e4b56d03"} Oct 10 07:11:08 crc kubenswrapper[4732]: I1010 07:11:08.158121 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0b57-account-create-glphj" 
event={"ID":"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb","Type":"ContainerStarted","Data":"1338f64446143ad4cec93b015ae7043fa48566f595225ef293cf958ca319f0d0"} Oct 10 07:11:08 crc kubenswrapper[4732]: W1010 07:11:08.202229 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc85cd845_7899_4892_be21_259881ff6ed5.slice/crio-e8729e575a50885b704ac436adf3b973901276557a3e3139eb3b38a023172215 WatchSource:0}: Error finding container e8729e575a50885b704ac436adf3b973901276557a3e3139eb3b38a023172215: Status 404 returned error can't find the container with id e8729e575a50885b704ac436adf3b973901276557a3e3139eb3b38a023172215 Oct 10 07:11:08 crc kubenswrapper[4732]: W1010 07:11:08.202948 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2fb6fd2_36fc_4a19_8462_f59d719b09d9.slice/crio-7c612bdd000004a818371de002bd433d8d448de68e234da2f2f60100874c608f WatchSource:0}: Error finding container 7c612bdd000004a818371de002bd433d8d448de68e234da2f2f60100874c608f: Status 404 returned error can't find the container with id 7c612bdd000004a818371de002bd433d8d448de68e234da2f2f60100874c608f Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.172584 4732 generic.go:334] "Generic (PLEG): container finished" podID="c85cd845-7899-4892-be21-259881ff6ed5" containerID="3cd3567830dcb39ce650b51a0ea69fd5975608aac30a059595b9a8f437180072" exitCode=0 Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.172739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a8d0-account-create-fx6nt" event={"ID":"c85cd845-7899-4892-be21-259881ff6ed5","Type":"ContainerDied","Data":"3cd3567830dcb39ce650b51a0ea69fd5975608aac30a059595b9a8f437180072"} Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.172939 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a8d0-account-create-fx6nt" 
event={"ID":"c85cd845-7899-4892-be21-259881ff6ed5","Type":"ContainerStarted","Data":"e8729e575a50885b704ac436adf3b973901276557a3e3139eb3b38a023172215"} Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.174011 4732 generic.go:334] "Generic (PLEG): container finished" podID="f2fb6fd2-36fc-4a19-8462-f59d719b09d9" containerID="641a6cdf412db0e2310d7f541cc247d83efc9d202289b3873b100377259ddddd" exitCode=0 Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.174179 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0878-account-create-hdbsh" event={"ID":"f2fb6fd2-36fc-4a19-8462-f59d719b09d9","Type":"ContainerDied","Data":"641a6cdf412db0e2310d7f541cc247d83efc9d202289b3873b100377259ddddd"} Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.174193 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0878-account-create-hdbsh" event={"ID":"f2fb6fd2-36fc-4a19-8462-f59d719b09d9","Type":"ContainerStarted","Data":"7c612bdd000004a818371de002bd433d8d448de68e234da2f2f60100874c608f"} Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.554858 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0b57-account-create-glphj" Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.681221 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcx5p\" (UniqueName: \"kubernetes.io/projected/6d536f03-ccb0-4f7f-9d6a-8e2250557ecb-kube-api-access-pcx5p\") pod \"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb\" (UID: \"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb\") " Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.687448 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d536f03-ccb0-4f7f-9d6a-8e2250557ecb-kube-api-access-pcx5p" (OuterVolumeSpecName: "kube-api-access-pcx5p") pod "6d536f03-ccb0-4f7f-9d6a-8e2250557ecb" (UID: "6d536f03-ccb0-4f7f-9d6a-8e2250557ecb"). 
InnerVolumeSpecName "kube-api-access-pcx5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:09 crc kubenswrapper[4732]: I1010 07:11:09.783682 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcx5p\" (UniqueName: \"kubernetes.io/projected/6d536f03-ccb0-4f7f-9d6a-8e2250557ecb-kube-api-access-pcx5p\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.183972 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0b57-account-create-glphj" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.183968 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0b57-account-create-glphj" event={"ID":"6d536f03-ccb0-4f7f-9d6a-8e2250557ecb","Type":"ContainerDied","Data":"1338f64446143ad4cec93b015ae7043fa48566f595225ef293cf958ca319f0d0"} Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.184521 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1338f64446143ad4cec93b015ae7043fa48566f595225ef293cf958ca319f0d0" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.578269 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0878-account-create-hdbsh" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.594003 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a8d0-account-create-fx6nt" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.697858 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7ttm\" (UniqueName: \"kubernetes.io/projected/c85cd845-7899-4892-be21-259881ff6ed5-kube-api-access-d7ttm\") pod \"c85cd845-7899-4892-be21-259881ff6ed5\" (UID: \"c85cd845-7899-4892-be21-259881ff6ed5\") " Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.698256 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd944\" (UniqueName: \"kubernetes.io/projected/f2fb6fd2-36fc-4a19-8462-f59d719b09d9-kube-api-access-bd944\") pod \"f2fb6fd2-36fc-4a19-8462-f59d719b09d9\" (UID: \"f2fb6fd2-36fc-4a19-8462-f59d719b09d9\") " Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.704905 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2fb6fd2-36fc-4a19-8462-f59d719b09d9-kube-api-access-bd944" (OuterVolumeSpecName: "kube-api-access-bd944") pod "f2fb6fd2-36fc-4a19-8462-f59d719b09d9" (UID: "f2fb6fd2-36fc-4a19-8462-f59d719b09d9"). InnerVolumeSpecName "kube-api-access-bd944". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.705777 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85cd845-7899-4892-be21-259881ff6ed5-kube-api-access-d7ttm" (OuterVolumeSpecName: "kube-api-access-d7ttm") pod "c85cd845-7899-4892-be21-259881ff6ed5" (UID: "c85cd845-7899-4892-be21-259881ff6ed5"). InnerVolumeSpecName "kube-api-access-d7ttm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.801235 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd944\" (UniqueName: \"kubernetes.io/projected/f2fb6fd2-36fc-4a19-8462-f59d719b09d9-kube-api-access-bd944\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:10 crc kubenswrapper[4732]: I1010 07:11:10.801270 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7ttm\" (UniqueName: \"kubernetes.io/projected/c85cd845-7899-4892-be21-259881ff6ed5-kube-api-access-d7ttm\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:11 crc kubenswrapper[4732]: I1010 07:11:11.198774 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a8d0-account-create-fx6nt" event={"ID":"c85cd845-7899-4892-be21-259881ff6ed5","Type":"ContainerDied","Data":"e8729e575a50885b704ac436adf3b973901276557a3e3139eb3b38a023172215"} Oct 10 07:11:11 crc kubenswrapper[4732]: I1010 07:11:11.198887 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8729e575a50885b704ac436adf3b973901276557a3e3139eb3b38a023172215" Oct 10 07:11:11 crc kubenswrapper[4732]: I1010 07:11:11.198823 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a8d0-account-create-fx6nt" Oct 10 07:11:11 crc kubenswrapper[4732]: I1010 07:11:11.200521 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0878-account-create-hdbsh" event={"ID":"f2fb6fd2-36fc-4a19-8462-f59d719b09d9","Type":"ContainerDied","Data":"7c612bdd000004a818371de002bd433d8d448de68e234da2f2f60100874c608f"} Oct 10 07:11:11 crc kubenswrapper[4732]: I1010 07:11:11.200578 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c612bdd000004a818371de002bd433d8d448de68e234da2f2f60100874c608f" Oct 10 07:11:11 crc kubenswrapper[4732]: I1010 07:11:11.200620 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0878-account-create-hdbsh" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.513464 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vf62z"] Oct 10 07:11:12 crc kubenswrapper[4732]: E1010 07:11:12.514228 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85cd845-7899-4892-be21-259881ff6ed5" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.514243 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85cd845-7899-4892-be21-259881ff6ed5" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: E1010 07:11:12.514272 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2fb6fd2-36fc-4a19-8462-f59d719b09d9" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.514280 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2fb6fd2-36fc-4a19-8462-f59d719b09d9" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: E1010 07:11:12.514305 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d536f03-ccb0-4f7f-9d6a-8e2250557ecb" 
containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.514312 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d536f03-ccb0-4f7f-9d6a-8e2250557ecb" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.514526 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85cd845-7899-4892-be21-259881ff6ed5" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.514543 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2fb6fd2-36fc-4a19-8462-f59d719b09d9" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.514560 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d536f03-ccb0-4f7f-9d6a-8e2250557ecb" containerName="mariadb-account-create" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.515226 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.518341 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lkm69" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.518378 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.518341 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.535741 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vf62z"] Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.634660 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-config-data\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.634929 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skpct\" (UniqueName: \"kubernetes.io/projected/f6767f74-220c-4299-ad0a-a12dcc2d7e24-kube-api-access-skpct\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.635074 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-scripts\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.635126 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.737069 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.737171 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-config-data\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.737226 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skpct\" (UniqueName: \"kubernetes.io/projected/f6767f74-220c-4299-ad0a-a12dcc2d7e24-kube-api-access-skpct\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.737278 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-scripts\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.744160 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-scripts\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.744496 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-config-data\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.746138 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.756518 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skpct\" (UniqueName: \"kubernetes.io/projected/f6767f74-220c-4299-ad0a-a12dcc2d7e24-kube-api-access-skpct\") pod \"nova-cell0-conductor-db-sync-vf62z\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:12 crc kubenswrapper[4732]: I1010 07:11:12.835413 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:13 crc kubenswrapper[4732]: I1010 07:11:13.308816 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vf62z"] Oct 10 07:11:14 crc kubenswrapper[4732]: I1010 07:11:14.225935 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vf62z" event={"ID":"f6767f74-220c-4299-ad0a-a12dcc2d7e24","Type":"ContainerStarted","Data":"9c37984554548c5df1261a8f8f2eefe8f3e571f18968fb5d73dc6580f9763adf"} Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.056901 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.057250 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.094443 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.117565 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.237024 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.238269 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.568039 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.568086 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.604702 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:15 crc kubenswrapper[4732]: I1010 07:11:15.610327 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:16 crc kubenswrapper[4732]: I1010 07:11:16.243456 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:16 crc kubenswrapper[4732]: I1010 07:11:16.243499 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:17 crc kubenswrapper[4732]: I1010 07:11:17.252091 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 07:11:17 crc kubenswrapper[4732]: I1010 07:11:17.252418 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 07:11:17 crc kubenswrapper[4732]: I1010 07:11:17.358965 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Oct 10 07:11:17 crc kubenswrapper[4732]: I1010 07:11:17.361613 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 07:11:18 crc kubenswrapper[4732]: I1010 07:11:18.116448 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:18 crc kubenswrapper[4732]: I1010 07:11:18.162218 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 07:11:21 crc kubenswrapper[4732]: I1010 07:11:21.310605 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vf62z" event={"ID":"f6767f74-220c-4299-ad0a-a12dcc2d7e24","Type":"ContainerStarted","Data":"8b20c40d1dd65dff94ce2787bef4e91341b5943f617e24c08f9f51746adf32f3"} Oct 10 07:11:21 crc kubenswrapper[4732]: I1010 07:11:21.331344 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vf62z" podStartSLOduration=2.582789925 podStartE2EDuration="9.331318294s" podCreationTimestamp="2025-10-10 07:11:12 +0000 UTC" firstStartedPulling="2025-10-10 07:11:13.316375511 +0000 UTC m=+1200.385966752" lastFinishedPulling="2025-10-10 07:11:20.06490387 +0000 UTC m=+1207.134495121" observedRunningTime="2025-10-10 07:11:21.326752249 +0000 UTC m=+1208.396343500" watchObservedRunningTime="2025-10-10 07:11:21.331318294 +0000 UTC m=+1208.400909545" Oct 10 07:11:28 crc kubenswrapper[4732]: I1010 07:11:28.382995 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 10 07:11:31 crc kubenswrapper[4732]: I1010 07:11:31.410154 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="f6767f74-220c-4299-ad0a-a12dcc2d7e24" containerID="8b20c40d1dd65dff94ce2787bef4e91341b5943f617e24c08f9f51746adf32f3" exitCode=0 Oct 10 07:11:31 crc kubenswrapper[4732]: I1010 07:11:31.410280 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vf62z" event={"ID":"f6767f74-220c-4299-ad0a-a12dcc2d7e24","Type":"ContainerDied","Data":"8b20c40d1dd65dff94ce2787bef4e91341b5943f617e24c08f9f51746adf32f3"} Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.771859 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.829612 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-config-data\") pod \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.829811 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skpct\" (UniqueName: \"kubernetes.io/projected/f6767f74-220c-4299-ad0a-a12dcc2d7e24-kube-api-access-skpct\") pod \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.830054 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-combined-ca-bundle\") pod \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.830163 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-scripts\") pod 
\"f6767f74-220c-4299-ad0a-a12dcc2d7e24\" (UID: \"f6767f74-220c-4299-ad0a-a12dcc2d7e24\") " Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.837149 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6767f74-220c-4299-ad0a-a12dcc2d7e24-kube-api-access-skpct" (OuterVolumeSpecName: "kube-api-access-skpct") pod "f6767f74-220c-4299-ad0a-a12dcc2d7e24" (UID: "f6767f74-220c-4299-ad0a-a12dcc2d7e24"). InnerVolumeSpecName "kube-api-access-skpct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.848994 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-scripts" (OuterVolumeSpecName: "scripts") pod "f6767f74-220c-4299-ad0a-a12dcc2d7e24" (UID: "f6767f74-220c-4299-ad0a-a12dcc2d7e24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.918195 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-config-data" (OuterVolumeSpecName: "config-data") pod "f6767f74-220c-4299-ad0a-a12dcc2d7e24" (UID: "f6767f74-220c-4299-ad0a-a12dcc2d7e24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.929991 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6767f74-220c-4299-ad0a-a12dcc2d7e24" (UID: "f6767f74-220c-4299-ad0a-a12dcc2d7e24"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.935796 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.935837 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skpct\" (UniqueName: \"kubernetes.io/projected/f6767f74-220c-4299-ad0a-a12dcc2d7e24-kube-api-access-skpct\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.935854 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:32 crc kubenswrapper[4732]: I1010 07:11:32.935864 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6767f74-220c-4299-ad0a-a12dcc2d7e24-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.441824 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vf62z" event={"ID":"f6767f74-220c-4299-ad0a-a12dcc2d7e24","Type":"ContainerDied","Data":"9c37984554548c5df1261a8f8f2eefe8f3e571f18968fb5d73dc6580f9763adf"} Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.441938 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c37984554548c5df1261a8f8f2eefe8f3e571f18968fb5d73dc6580f9763adf" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.442113 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vf62z" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.574549 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:11:33 crc kubenswrapper[4732]: E1010 07:11:33.575048 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6767f74-220c-4299-ad0a-a12dcc2d7e24" containerName="nova-cell0-conductor-db-sync" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.575076 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6767f74-220c-4299-ad0a-a12dcc2d7e24" containerName="nova-cell0-conductor-db-sync" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.575336 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6767f74-220c-4299-ad0a-a12dcc2d7e24" containerName="nova-cell0-conductor-db-sync" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.576142 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.578251 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.584941 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-lkm69" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.595625 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.650832 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0" Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 
07:11:33.650905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4jjq\" (UniqueName: \"kubernetes.io/projected/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-kube-api-access-x4jjq\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.651069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.753002 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.753501 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4jjq\" (UniqueName: \"kubernetes.io/projected/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-kube-api-access-x4jjq\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.753636 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.758766 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.761536 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.778028 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4jjq\" (UniqueName: \"kubernetes.io/projected/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-kube-api-access-x4jjq\") pod \"nova-cell0-conductor-0\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:33 crc kubenswrapper[4732]: I1010 07:11:33.910628 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:34 crc kubenswrapper[4732]: I1010 07:11:34.358071 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 10 07:11:34 crc kubenswrapper[4732]: I1010 07:11:34.454932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"13bb7b78-cc62-4d3b-a33a-9af77ee9e141","Type":"ContainerStarted","Data":"40b44a9fe0f27c82f8b65e4c05eb68c80f0b87a6c122db473eb69ca84fcd1ead"}
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.458835 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.465063 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"13bb7b78-cc62-4d3b-a33a-9af77ee9e141","Type":"ContainerStarted","Data":"f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c"}
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.465120 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.466778 4732 generic.go:334] "Generic (PLEG): container finished" podID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerID="6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9" exitCode=137
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.466811 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerDied","Data":"6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9"}
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.466830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e62f0a3-8369-4068-ba8f-8d8a2937cd99","Type":"ContainerDied","Data":"ee37e353405cffa380bf9989daf8eac321a5cd820d9018d6bec624aa5fb90de9"}
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.466849 4732 scope.go:117] "RemoveContainer" containerID="6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.466940 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.486976 4732 scope.go:117] "RemoveContainer" containerID="fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.520040 4732 scope.go:117] "RemoveContainer" containerID="b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.526587 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.526569935 podStartE2EDuration="2.526569935s" podCreationTimestamp="2025-10-10 07:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:35.520397667 +0000 UTC m=+1222.589988918" watchObservedRunningTime="2025-10-10 07:11:35.526569935 +0000 UTC m=+1222.596161176"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.546383 4732 scope.go:117] "RemoveContainer" containerID="96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.578271 4732 scope.go:117] "RemoveContainer" containerID="6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9"
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.579045 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9\": container with ID starting with 6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9 not found: ID does not exist" containerID="6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.579073 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9"} err="failed to get container status \"6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9\": rpc error: code = NotFound desc = could not find container \"6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9\": container with ID starting with 6ee7b0e96684fa9c801fb40014265532b59e30a718c7c342d1eb252c999dcef9 not found: ID does not exist"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.579115 4732 scope.go:117] "RemoveContainer" containerID="fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443"
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.579375 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443\": container with ID starting with fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443 not found: ID does not exist" containerID="fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.579394 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443"} err="failed to get container status \"fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443\": rpc error: code = NotFound desc = could not find container \"fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443\": container with ID starting with fb32768730365f0570dd453eba2302dd55d8db693144b74e23f7c44decba7443 not found: ID does not exist"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.579406 4732 scope.go:117] "RemoveContainer" containerID="b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961"
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.579630 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961\": container with ID starting with b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961 not found: ID does not exist" containerID="b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.579646 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961"} err="failed to get container status \"b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961\": rpc error: code = NotFound desc = could not find container \"b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961\": container with ID starting with b56c811fe126f7e31d62af7b66599830ed41a65f70ae015acaffbad772382961 not found: ID does not exist"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.579660 4732 scope.go:117] "RemoveContainer" containerID="96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb"
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.580316 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb\": container with ID starting with 96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb not found: ID does not exist" containerID="96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.580352 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb"} err="failed to get container status \"96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb\": rpc error: code = NotFound desc = could not find container \"96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb\": container with ID starting with 96e5d7da8674315a9ed8aaf0d3e6d0b40fcbc219150b05f4e290a8579a91b9bb not found: ID does not exist"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.580634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2jll\" (UniqueName: \"kubernetes.io/projected/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-kube-api-access-w2jll\") pod \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") "
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.580749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-run-httpd\") pod \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") "
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.580766 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-config-data\") pod \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") "
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.580895 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-sg-core-conf-yaml\") pod \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") "
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.580946 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-log-httpd\") pod \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") "
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.580988 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-scripts\") pod \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") "
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.581032 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-combined-ca-bundle\") pod \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\" (UID: \"9e62f0a3-8369-4068-ba8f-8d8a2937cd99\") "
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.582759 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9e62f0a3-8369-4068-ba8f-8d8a2937cd99" (UID: "9e62f0a3-8369-4068-ba8f-8d8a2937cd99"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.584215 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9e62f0a3-8369-4068-ba8f-8d8a2937cd99" (UID: "9e62f0a3-8369-4068-ba8f-8d8a2937cd99"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.588108 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-scripts" (OuterVolumeSpecName: "scripts") pod "9e62f0a3-8369-4068-ba8f-8d8a2937cd99" (UID: "9e62f0a3-8369-4068-ba8f-8d8a2937cd99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.588900 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-kube-api-access-w2jll" (OuterVolumeSpecName: "kube-api-access-w2jll") pod "9e62f0a3-8369-4068-ba8f-8d8a2937cd99" (UID: "9e62f0a3-8369-4068-ba8f-8d8a2937cd99"). InnerVolumeSpecName "kube-api-access-w2jll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.616209 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9e62f0a3-8369-4068-ba8f-8d8a2937cd99" (UID: "9e62f0a3-8369-4068-ba8f-8d8a2937cd99"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.684615 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.684674 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.684741 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.684880 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2jll\" (UniqueName: \"kubernetes.io/projected/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-kube-api-access-w2jll\") on node \"crc\" DevicePath \"\""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.684911 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.704229 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-config-data" (OuterVolumeSpecName: "config-data") pod "9e62f0a3-8369-4068-ba8f-8d8a2937cd99" (UID: "9e62f0a3-8369-4068-ba8f-8d8a2937cd99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.713354 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e62f0a3-8369-4068-ba8f-8d8a2937cd99" (UID: "9e62f0a3-8369-4068-ba8f-8d8a2937cd99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.787155 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.787185 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e62f0a3-8369-4068-ba8f-8d8a2937cd99-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.832454 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.844329 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853015 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.853431 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="sg-core"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853452 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="sg-core"
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.853469 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-central-agent"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853476 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-central-agent"
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.853496 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="proxy-httpd"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853503 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="proxy-httpd"
Oct 10 07:11:35 crc kubenswrapper[4732]: E1010 07:11:35.853534 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-notification-agent"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853542 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-notification-agent"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853759 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-notification-agent"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853777 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="ceilometer-central-agent"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853799 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="sg-core"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.853816 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" containerName="proxy-httpd"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.855738 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.858163 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.859193 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.874234 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.995962 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rx8v\" (UniqueName: \"kubernetes.io/projected/c073656d-ced9-4557-b243-6436287d45f2-kube-api-access-9rx8v\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.996044 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.996102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-scripts\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.996129 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-config-data\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.996169 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-run-httpd\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.996252 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-log-httpd\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:35 crc kubenswrapper[4732]: I1010 07:11:35.996276 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.099706 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-scripts\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.099754 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-config-data\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.099814 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-run-httpd\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.099897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-log-httpd\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.099922 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.099987 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rx8v\" (UniqueName: \"kubernetes.io/projected/c073656d-ced9-4557-b243-6436287d45f2-kube-api-access-9rx8v\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.100025 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.100539 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-log-httpd\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.100784 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-run-httpd\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.103670 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.103863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-config-data\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.104019 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.104375 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-scripts\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.129150 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rx8v\" (UniqueName: \"kubernetes.io/projected/c073656d-ced9-4557-b243-6436287d45f2-kube-api-access-9rx8v\") pod \"ceilometer-0\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.175774 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 10 07:11:36 crc kubenswrapper[4732]: I1010 07:11:36.643999 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 07:11:37 crc kubenswrapper[4732]: I1010 07:11:37.485531 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerStarted","Data":"f749aac1c3128e8bb73d2aea323ce1c264d3e4f6ebaa237a4a0da2e0f2f8d619"}
Oct 10 07:11:37 crc kubenswrapper[4732]: I1010 07:11:37.485870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerStarted","Data":"81132a6f8574d0095bd62dd92ba9069617837d9297212f13a14e88b5b841ad0c"}
Oct 10 07:11:37 crc kubenswrapper[4732]: I1010 07:11:37.669379 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e62f0a3-8369-4068-ba8f-8d8a2937cd99" path="/var/lib/kubelet/pods/9e62f0a3-8369-4068-ba8f-8d8a2937cd99/volumes"
Oct 10 07:11:38 crc kubenswrapper[4732]: I1010 07:11:38.495512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerStarted","Data":"6ae7fb61d646953cb25b72f435163da67333a8e5be9470f635bfae752e17fbf3"}
Oct 10 07:11:39 crc kubenswrapper[4732]: I1010 07:11:39.507329 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerStarted","Data":"f352ed4bc8699274befd0304977aa60b3e0b9612209d1bd2bfafdb028382f639"}
Oct 10 07:11:41 crc kubenswrapper[4732]: I1010 07:11:41.526644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerStarted","Data":"36ca83580adb755b0ec73cc7a25c4dd14e2a30895334d3561ec2e969937e858a"}
Oct 10 07:11:41 crc kubenswrapper[4732]: I1010 07:11:41.527245 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 10 07:11:41 crc kubenswrapper[4732]: I1010 07:11:41.558589 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.000391344 podStartE2EDuration="6.55857203s" podCreationTimestamp="2025-10-10 07:11:35 +0000 UTC" firstStartedPulling="2025-10-10 07:11:36.640142301 +0000 UTC m=+1223.709733542" lastFinishedPulling="2025-10-10 07:11:40.198322987 +0000 UTC m=+1227.267914228" observedRunningTime="2025-10-10 07:11:41.552230777 +0000 UTC m=+1228.621822018" watchObservedRunningTime="2025-10-10 07:11:41.55857203 +0000 UTC m=+1228.628163261"
Oct 10 07:11:43 crc kubenswrapper[4732]: I1010 07:11:43.947436 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.581895 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vn6nz"]
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.583164 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.585877 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.586024 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.591968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vn6nz"]
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.651598 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-config-data\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.651958 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-scripts\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.651994 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbczf\" (UniqueName: \"kubernetes.io/projected/95d185c7-30e1-4efc-acd8-5ce5b3784a47-kube-api-access-tbczf\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.652028 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.753307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-scripts\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.753370 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbczf\" (UniqueName: \"kubernetes.io/projected/95d185c7-30e1-4efc-acd8-5ce5b3784a47-kube-api-access-tbczf\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.753409 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.753515 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-config-data\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.761751 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.782220 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-scripts\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.782673 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-config-data\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.789416 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbczf\" (UniqueName: \"kubernetes.io/projected/95d185c7-30e1-4efc-acd8-5ce5b3784a47-kube-api-access-tbczf\") pod \"nova-cell0-cell-mapping-vn6nz\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " pod="openstack/nova-cell0-cell-mapping-vn6nz"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.798038 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.799877 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.802337 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.805319 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.824138 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.825414 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.834449 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.857800 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-config-data\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.858045 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.858078 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67321190-8765-49cf-98b9-3dbd3c6b45cb-logs\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0"
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010
07:11:44.858141 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27fl\" (UniqueName: \"kubernetes.io/projected/67321190-8765-49cf-98b9-3dbd3c6b45cb-kube-api-access-c27fl\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.861006 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.910032 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.911461 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.913964 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.928984 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.937904 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vn6nz" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.956424 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.957536 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.968032 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.969578 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c27fl\" (UniqueName: \"kubernetes.io/projected/67321190-8765-49cf-98b9-3dbd3c6b45cb-kube-api-access-c27fl\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.969626 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.969643 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014559c5-0f9e-40f2-9970-8aa80bace48b-logs\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.969666 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-config-data\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.969710 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbjck\" (UniqueName: 
\"kubernetes.io/projected/1663f81e-0a18-486e-bbe9-4bb332fd5af7-kube-api-access-pbjck\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.969729 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-config-data\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.971107 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbx8h\" (UniqueName: \"kubernetes.io/projected/014559c5-0f9e-40f2-9970-8aa80bace48b-kube-api-access-gbx8h\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.971184 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-config-data\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.971219 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.971274 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67321190-8765-49cf-98b9-3dbd3c6b45cb-logs\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" 
Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.971385 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.977013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67321190-8765-49cf-98b9-3dbd3c6b45cb-logs\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:44 crc kubenswrapper[4732]: I1010 07:11:44.982627 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-config-data\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.010311 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.038930 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27fl\" (UniqueName: \"kubernetes.io/projected/67321190-8765-49cf-98b9-3dbd3c6b45cb-kube-api-access-c27fl\") pod \"nova-api-0\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " pod="openstack/nova-api-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.074504 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b7586c88c-6ch46"] Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.075990 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076098 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076200 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076276 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014559c5-0f9e-40f2-9970-8aa80bace48b-logs\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076357 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-config-data\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076439 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbjck\" (UniqueName: 
\"kubernetes.io/projected/1663f81e-0a18-486e-bbe9-4bb332fd5af7-kube-api-access-pbjck\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076515 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-config-data\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076590 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbx8h\" (UniqueName: \"kubernetes.io/projected/014559c5-0f9e-40f2-9970-8aa80bace48b-kube-api-access-gbx8h\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076852 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tprkn\" (UniqueName: \"kubernetes.io/projected/e7bf1479-c766-4541-8b5d-38b98d8929b7-kube-api-access-tprkn\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.082649 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014559c5-0f9e-40f2-9970-8aa80bace48b-logs\") pod \"nova-metadata-0\" (UID: 
\"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.087104 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-config-data\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.076014 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.087583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-config-data\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.087902 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.091624 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.106765 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.126294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbx8h\" 
(UniqueName: \"kubernetes.io/projected/014559c5-0f9e-40f2-9970-8aa80bace48b-kube-api-access-gbx8h\") pod \"nova-metadata-0\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.126811 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbjck\" (UniqueName: \"kubernetes.io/projected/1663f81e-0a18-486e-bbe9-4bb332fd5af7-kube-api-access-pbjck\") pod \"nova-scheduler-0\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " pod="openstack/nova-scheduler-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.136587 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b7586c88c-6ch46"] Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.180853 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.181077 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2srjq\" (UniqueName: \"kubernetes.io/projected/a634b425-c334-4ffb-9ea0-f8deab7c9b00-kube-api-access-2srjq\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.181254 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.181295 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-svc\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.181316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-sb\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.181367 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-swift-storage-0\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.181387 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tprkn\" (UniqueName: \"kubernetes.io/projected/e7bf1479-c766-4541-8b5d-38b98d8929b7-kube-api-access-tprkn\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.181441 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-nb\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 
07:11:45.181462 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-config\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.190774 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.192431 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.210457 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.217708 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tprkn\" (UniqueName: \"kubernetes.io/projected/e7bf1479-c766-4541-8b5d-38b98d8929b7-kube-api-access-tprkn\") pod \"nova-cell1-novncproxy-0\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.222917 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.256074 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.286684 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-svc\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.286749 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-sb\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.286805 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-swift-storage-0\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.286869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-nb\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.286892 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-config\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 
crc kubenswrapper[4732]: I1010 07:11:45.286936 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2srjq\" (UniqueName: \"kubernetes.io/projected/a634b425-c334-4ffb-9ea0-f8deab7c9b00-kube-api-access-2srjq\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.288728 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-svc\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.288822 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-sb\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.289645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-nb\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.290497 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-swift-storage-0\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.291867 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-config\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.305377 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2srjq\" (UniqueName: \"kubernetes.io/projected/a634b425-c334-4ffb-9ea0-f8deab7c9b00-kube-api-access-2srjq\") pod \"dnsmasq-dns-7b7586c88c-6ch46\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.479325 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.497708 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.556024 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vn6nz"] Oct 10 07:11:45 crc kubenswrapper[4732]: W1010 07:11:45.560966 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d185c7_30e1_4efc_acd8_5ce5b3784a47.slice/crio-c3f157520402bcbd0f4b0216402b279edd621ebf3d2554957745d188a90e673e WatchSource:0}: Error finding container c3f157520402bcbd0f4b0216402b279edd621ebf3d2554957745d188a90e673e: Status 404 returned error can't find the container with id c3f157520402bcbd0f4b0216402b279edd621ebf3d2554957745d188a90e673e Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.831251 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c29q6"] Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.848778 4732 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c29q6"] Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.852926 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.862146 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.863654 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.902274 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.959336 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.963051 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-scripts\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.963102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm58p\" (UniqueName: \"kubernetes.io/projected/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-kube-api-access-hm58p\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.963149 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.963188 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-config-data\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:45 crc kubenswrapper[4732]: I1010 07:11:45.972338 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.064807 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-config-data\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.065280 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-scripts\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.065328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm58p\" (UniqueName: \"kubernetes.io/projected/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-kube-api-access-hm58p\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc 
kubenswrapper[4732]: I1010 07:11:46.065379 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.075863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.076354 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-scripts\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.082211 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-config-data\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.086293 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm58p\" (UniqueName: \"kubernetes.io/projected/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-kube-api-access-hm58p\") pod \"nova-cell1-conductor-db-sync-c29q6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 
07:11:46.230891 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.268114 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b7586c88c-6ch46"] Oct 10 07:11:46 crc kubenswrapper[4732]: W1010 07:11:46.285667 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda634b425_c334_4ffb_9ea0_f8deab7c9b00.slice/crio-b7ea7ed5793e11dbda6361a46c75fca0201ee91ad9487007cc7a05b4c0e37a60 WatchSource:0}: Error finding container b7ea7ed5793e11dbda6361a46c75fca0201ee91ad9487007cc7a05b4c0e37a60: Status 404 returned error can't find the container with id b7ea7ed5793e11dbda6361a46c75fca0201ee91ad9487007cc7a05b4c0e37a60 Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.359225 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.576417 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" event={"ID":"a634b425-c334-4ffb-9ea0-f8deab7c9b00","Type":"ContainerStarted","Data":"0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.576457 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" event={"ID":"a634b425-c334-4ffb-9ea0-f8deab7c9b00","Type":"ContainerStarted","Data":"b7ea7ed5793e11dbda6361a46c75fca0201ee91ad9487007cc7a05b4c0e37a60"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.587235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1663f81e-0a18-486e-bbe9-4bb332fd5af7","Type":"ContainerStarted","Data":"ad61b65581198be2df4b45c74159704c5a185e22b3c6e33c83911014e131fd34"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 
07:11:46.588591 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vn6nz" event={"ID":"95d185c7-30e1-4efc-acd8-5ce5b3784a47","Type":"ContainerStarted","Data":"04b3bf834ce0e2c1ec7e80b92fd88bfde1cce05f5dc166ae8556a77592cda914"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.588613 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vn6nz" event={"ID":"95d185c7-30e1-4efc-acd8-5ce5b3784a47","Type":"ContainerStarted","Data":"c3f157520402bcbd0f4b0216402b279edd621ebf3d2554957745d188a90e673e"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.601334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"014559c5-0f9e-40f2-9970-8aa80bace48b","Type":"ContainerStarted","Data":"2af16c11196e6e0fe1c1f9da54304793744a7162b255fafcc152ea346d2b2ac9"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.602567 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7bf1479-c766-4541-8b5d-38b98d8929b7","Type":"ContainerStarted","Data":"adadffc3791667c1f304b68e0d061a1502226cb6c35d542977770dc7c7a31881"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.605773 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67321190-8765-49cf-98b9-3dbd3c6b45cb","Type":"ContainerStarted","Data":"1c81f151660b29eb0b011ed8ae699aeed979ac4daf2d7b28ed76c9f55f4bddd9"} Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.621015 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vn6nz" podStartSLOduration=2.6210002489999997 podStartE2EDuration="2.621000249s" podCreationTimestamp="2025-10-10 07:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:46.617077522 +0000 UTC m=+1233.686668763" 
watchObservedRunningTime="2025-10-10 07:11:46.621000249 +0000 UTC m=+1233.690591490" Oct 10 07:11:46 crc kubenswrapper[4732]: I1010 07:11:46.785158 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c29q6"] Oct 10 07:11:47 crc kubenswrapper[4732]: I1010 07:11:47.622298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c29q6" event={"ID":"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6","Type":"ContainerStarted","Data":"37cc9165f0d1920ae33dae00b95b24914cdd3fbd5dce1ff153580c83b03a2d77"} Oct 10 07:11:47 crc kubenswrapper[4732]: I1010 07:11:47.622562 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c29q6" event={"ID":"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6","Type":"ContainerStarted","Data":"dc8fc5c6aa6fa2d3687b7232a781eff61538f4871c9cd8e9836037bb7bd0fa04"} Oct 10 07:11:47 crc kubenswrapper[4732]: I1010 07:11:47.624750 4732 generic.go:334] "Generic (PLEG): container finished" podID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerID="0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97" exitCode=0 Oct 10 07:11:47 crc kubenswrapper[4732]: I1010 07:11:47.625721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" event={"ID":"a634b425-c334-4ffb-9ea0-f8deab7c9b00","Type":"ContainerDied","Data":"0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97"} Oct 10 07:11:47 crc kubenswrapper[4732]: I1010 07:11:47.642548 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-c29q6" podStartSLOduration=2.642529764 podStartE2EDuration="2.642529764s" podCreationTimestamp="2025-10-10 07:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:47.637856917 +0000 UTC m=+1234.707448168" watchObservedRunningTime="2025-10-10 
07:11:47.642529764 +0000 UTC m=+1234.712121005" Oct 10 07:11:48 crc kubenswrapper[4732]: I1010 07:11:48.381909 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:11:48 crc kubenswrapper[4732]: I1010 07:11:48.393831 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.645955 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1663f81e-0a18-486e-bbe9-4bb332fd5af7","Type":"ContainerStarted","Data":"073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841"} Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.648923 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-log" containerID="cri-o://db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f" gracePeriod=30 Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.649467 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"014559c5-0f9e-40f2-9970-8aa80bace48b","Type":"ContainerStarted","Data":"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9"} Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.649559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"014559c5-0f9e-40f2-9970-8aa80bace48b","Type":"ContainerStarted","Data":"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f"} Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.650172 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-metadata" containerID="cri-o://1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9" gracePeriod=30 Oct 10 07:11:49 crc 
kubenswrapper[4732]: I1010 07:11:49.658166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67321190-8765-49cf-98b9-3dbd3c6b45cb","Type":"ContainerStarted","Data":"517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e"} Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.658222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67321190-8765-49cf-98b9-3dbd3c6b45cb","Type":"ContainerStarted","Data":"71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb"} Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.680381 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.856519451 podStartE2EDuration="5.680362383s" podCreationTimestamp="2025-10-10 07:11:44 +0000 UTC" firstStartedPulling="2025-10-10 07:11:45.901575491 +0000 UTC m=+1232.971166732" lastFinishedPulling="2025-10-10 07:11:48.725418423 +0000 UTC m=+1235.795009664" observedRunningTime="2025-10-10 07:11:49.673145546 +0000 UTC m=+1236.742736797" watchObservedRunningTime="2025-10-10 07:11:49.680362383 +0000 UTC m=+1236.749953624" Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.701187 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.951617593 podStartE2EDuration="5.70116975s" podCreationTimestamp="2025-10-10 07:11:44 +0000 UTC" firstStartedPulling="2025-10-10 07:11:45.990266319 +0000 UTC m=+1233.059857560" lastFinishedPulling="2025-10-10 07:11:48.739818476 +0000 UTC m=+1235.809409717" observedRunningTime="2025-10-10 07:11:49.691235419 +0000 UTC m=+1236.760826680" watchObservedRunningTime="2025-10-10 07:11:49.70116975 +0000 UTC m=+1236.770760991" Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.710576 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:49 crc 
kubenswrapper[4732]: I1010 07:11:49.710622 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" event={"ID":"a634b425-c334-4ffb-9ea0-f8deab7c9b00","Type":"ContainerStarted","Data":"b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297"} Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.719673 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" podStartSLOduration=4.719656444 podStartE2EDuration="4.719656444s" podCreationTimestamp="2025-10-10 07:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:49.718465162 +0000 UTC m=+1236.788056423" watchObservedRunningTime="2025-10-10 07:11:49.719656444 +0000 UTC m=+1236.789247685" Oct 10 07:11:49 crc kubenswrapper[4732]: I1010 07:11:49.739959 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.984189941 podStartE2EDuration="5.739940057s" podCreationTimestamp="2025-10-10 07:11:44 +0000 UTC" firstStartedPulling="2025-10-10 07:11:45.980439491 +0000 UTC m=+1233.050030732" lastFinishedPulling="2025-10-10 07:11:48.736189607 +0000 UTC m=+1235.805780848" observedRunningTime="2025-10-10 07:11:49.734970722 +0000 UTC m=+1236.804561963" watchObservedRunningTime="2025-10-10 07:11:49.739940057 +0000 UTC m=+1236.809531298" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.224397 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.256934 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.256980 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc 
kubenswrapper[4732]: I1010 07:11:50.405245 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.575290 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbx8h\" (UniqueName: \"kubernetes.io/projected/014559c5-0f9e-40f2-9970-8aa80bace48b-kube-api-access-gbx8h\") pod \"014559c5-0f9e-40f2-9970-8aa80bace48b\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.575328 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014559c5-0f9e-40f2-9970-8aa80bace48b-logs\") pod \"014559c5-0f9e-40f2-9970-8aa80bace48b\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.575495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-config-data\") pod \"014559c5-0f9e-40f2-9970-8aa80bace48b\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.575541 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-combined-ca-bundle\") pod \"014559c5-0f9e-40f2-9970-8aa80bace48b\" (UID: \"014559c5-0f9e-40f2-9970-8aa80bace48b\") " Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.576097 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/014559c5-0f9e-40f2-9970-8aa80bace48b-logs" (OuterVolumeSpecName: "logs") pod "014559c5-0f9e-40f2-9970-8aa80bace48b" (UID: "014559c5-0f9e-40f2-9970-8aa80bace48b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.582727 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014559c5-0f9e-40f2-9970-8aa80bace48b-kube-api-access-gbx8h" (OuterVolumeSpecName: "kube-api-access-gbx8h") pod "014559c5-0f9e-40f2-9970-8aa80bace48b" (UID: "014559c5-0f9e-40f2-9970-8aa80bace48b"). InnerVolumeSpecName "kube-api-access-gbx8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.600627 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-config-data" (OuterVolumeSpecName: "config-data") pod "014559c5-0f9e-40f2-9970-8aa80bace48b" (UID: "014559c5-0f9e-40f2-9970-8aa80bace48b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.603118 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "014559c5-0f9e-40f2-9970-8aa80bace48b" (UID: "014559c5-0f9e-40f2-9970-8aa80bace48b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.677328 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.677360 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/014559c5-0f9e-40f2-9970-8aa80bace48b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.677375 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbx8h\" (UniqueName: \"kubernetes.io/projected/014559c5-0f9e-40f2-9970-8aa80bace48b-kube-api-access-gbx8h\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.677389 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/014559c5-0f9e-40f2-9970-8aa80bace48b-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.678014 4732 generic.go:334] "Generic (PLEG): container finished" podID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerID="1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9" exitCode=0 Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.678036 4732 generic.go:334] "Generic (PLEG): container finished" podID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerID="db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f" exitCode=143 Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.678076 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"014559c5-0f9e-40f2-9970-8aa80bace48b","Type":"ContainerDied","Data":"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9"} Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.678105 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"014559c5-0f9e-40f2-9970-8aa80bace48b","Type":"ContainerDied","Data":"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f"} Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.678118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"014559c5-0f9e-40f2-9970-8aa80bace48b","Type":"ContainerDied","Data":"2af16c11196e6e0fe1c1f9da54304793744a7162b255fafcc152ea346d2b2ac9"} Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.678136 4732 scope.go:117] "RemoveContainer" containerID="1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.678264 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.685659 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7bf1479-c766-4541-8b5d-38b98d8929b7","Type":"ContainerStarted","Data":"743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e"} Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.685944 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e7bf1479-c766-4541-8b5d-38b98d8929b7" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e" gracePeriod=30 Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.706097 4732 scope.go:117] "RemoveContainer" containerID="db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.711025 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.940319246 podStartE2EDuration="6.711003037s" 
podCreationTimestamp="2025-10-10 07:11:44 +0000 UTC" firstStartedPulling="2025-10-10 07:11:46.367580078 +0000 UTC m=+1233.437171309" lastFinishedPulling="2025-10-10 07:11:50.138263869 +0000 UTC m=+1237.207855100" observedRunningTime="2025-10-10 07:11:50.704431348 +0000 UTC m=+1237.774022629" watchObservedRunningTime="2025-10-10 07:11:50.711003037 +0000 UTC m=+1237.780594288" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.734343 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.741194 4732 scope.go:117] "RemoveContainer" containerID="1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9" Oct 10 07:11:50 crc kubenswrapper[4732]: E1010 07:11:50.742615 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9\": container with ID starting with 1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9 not found: ID does not exist" containerID="1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.742653 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9"} err="failed to get container status \"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9\": rpc error: code = NotFound desc = could not find container \"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9\": container with ID starting with 1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9 not found: ID does not exist" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.742680 4732 scope.go:117] "RemoveContainer" containerID="db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f" Oct 10 07:11:50 crc kubenswrapper[4732]: E1010 
07:11:50.742981 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f\": container with ID starting with db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f not found: ID does not exist" containerID="db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.743010 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f"} err="failed to get container status \"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f\": rpc error: code = NotFound desc = could not find container \"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f\": container with ID starting with db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f not found: ID does not exist" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.743031 4732 scope.go:117] "RemoveContainer" containerID="1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.746506 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9"} err="failed to get container status \"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9\": rpc error: code = NotFound desc = could not find container \"1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9\": container with ID starting with 1890dcb646878205d5cc2b1d1ccd8c4dd974c0adc7e899492a460862651156a9 not found: ID does not exist" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.746548 4732 scope.go:117] "RemoveContainer" containerID="db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f" Oct 10 07:11:50 crc 
kubenswrapper[4732]: I1010 07:11:50.747109 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f"} err="failed to get container status \"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f\": rpc error: code = NotFound desc = could not find container \"db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f\": container with ID starting with db61a3df702094a4ba604753062919542e843b46fa88abdc2f6d8ea5f009487f not found: ID does not exist" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.749776 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.758306 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:50 crc kubenswrapper[4732]: E1010 07:11:50.758814 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-log" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.758828 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-log" Oct 10 07:11:50 crc kubenswrapper[4732]: E1010 07:11:50.758839 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-metadata" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.758845 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-metadata" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.759027 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-metadata" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.759040 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" containerName="nova-metadata-log" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.759925 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.762364 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.763445 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.795897 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.880905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.881175 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.881330 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x272n\" (UniqueName: \"kubernetes.io/projected/ab751a16-bca4-4b6c-ae02-00476f5f2098-kube-api-access-x272n\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc 
kubenswrapper[4732]: I1010 07:11:50.881431 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-config-data\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.881525 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab751a16-bca4-4b6c-ae02-00476f5f2098-logs\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.983847 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x272n\" (UniqueName: \"kubernetes.io/projected/ab751a16-bca4-4b6c-ae02-00476f5f2098-kube-api-access-x272n\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.984254 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-config-data\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.984295 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab751a16-bca4-4b6c-ae02-00476f5f2098-logs\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.984502 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.984563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.984899 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab751a16-bca4-4b6c-ae02-00476f5f2098-logs\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.988533 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.989440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-config-data\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:50 crc kubenswrapper[4732]: I1010 07:11:50.989522 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:51 crc 
kubenswrapper[4732]: I1010 07:11:51.007613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x272n\" (UniqueName: \"kubernetes.io/projected/ab751a16-bca4-4b6c-ae02-00476f5f2098-kube-api-access-x272n\") pod \"nova-metadata-0\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " pod="openstack/nova-metadata-0" Oct 10 07:11:51 crc kubenswrapper[4732]: I1010 07:11:51.168869 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:51 crc kubenswrapper[4732]: I1010 07:11:51.694265 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014559c5-0f9e-40f2-9970-8aa80bace48b" path="/var/lib/kubelet/pods/014559c5-0f9e-40f2-9970-8aa80bace48b/volumes" Oct 10 07:11:51 crc kubenswrapper[4732]: I1010 07:11:51.695515 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:51 crc kubenswrapper[4732]: I1010 07:11:51.703071 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab751a16-bca4-4b6c-ae02-00476f5f2098","Type":"ContainerStarted","Data":"8cb848ae27a973a7afe53b7b6be4e50f0d3aaf96dbfb5fd4caac8654faa514da"} Oct 10 07:11:52 crc kubenswrapper[4732]: I1010 07:11:52.733866 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab751a16-bca4-4b6c-ae02-00476f5f2098","Type":"ContainerStarted","Data":"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94"} Oct 10 07:11:52 crc kubenswrapper[4732]: I1010 07:11:52.734257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab751a16-bca4-4b6c-ae02-00476f5f2098","Type":"ContainerStarted","Data":"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc"} Oct 10 07:11:52 crc kubenswrapper[4732]: I1010 07:11:52.756770 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.756748581 podStartE2EDuration="2.756748581s" podCreationTimestamp="2025-10-10 07:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:52.751253251 +0000 UTC m=+1239.820844512" watchObservedRunningTime="2025-10-10 07:11:52.756748581 +0000 UTC m=+1239.826339822" Oct 10 07:11:53 crc kubenswrapper[4732]: I1010 07:11:53.744764 4732 generic.go:334] "Generic (PLEG): container finished" podID="95d185c7-30e1-4efc-acd8-5ce5b3784a47" containerID="04b3bf834ce0e2c1ec7e80b92fd88bfde1cce05f5dc166ae8556a77592cda914" exitCode=0 Oct 10 07:11:53 crc kubenswrapper[4732]: I1010 07:11:53.744849 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vn6nz" event={"ID":"95d185c7-30e1-4efc-acd8-5ce5b3784a47","Type":"ContainerDied","Data":"04b3bf834ce0e2c1ec7e80b92fd88bfde1cce05f5dc166ae8556a77592cda914"} Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.121715 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vn6nz" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.211821 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.211877 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.227665 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.261071 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.277560 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-config-data\") pod \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.277925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbczf\" (UniqueName: \"kubernetes.io/projected/95d185c7-30e1-4efc-acd8-5ce5b3784a47-kube-api-access-tbczf\") pod \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.278034 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-combined-ca-bundle\") pod \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.278180 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-scripts\") pod \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\" (UID: \"95d185c7-30e1-4efc-acd8-5ce5b3784a47\") " Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.284802 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-scripts" (OuterVolumeSpecName: "scripts") pod "95d185c7-30e1-4efc-acd8-5ce5b3784a47" (UID: "95d185c7-30e1-4efc-acd8-5ce5b3784a47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.286221 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d185c7-30e1-4efc-acd8-5ce5b3784a47-kube-api-access-tbczf" (OuterVolumeSpecName: "kube-api-access-tbczf") pod "95d185c7-30e1-4efc-acd8-5ce5b3784a47" (UID: "95d185c7-30e1-4efc-acd8-5ce5b3784a47"). InnerVolumeSpecName "kube-api-access-tbczf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.316880 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-config-data" (OuterVolumeSpecName: "config-data") pod "95d185c7-30e1-4efc-acd8-5ce5b3784a47" (UID: "95d185c7-30e1-4efc-acd8-5ce5b3784a47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.327862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95d185c7-30e1-4efc-acd8-5ce5b3784a47" (UID: "95d185c7-30e1-4efc-acd8-5ce5b3784a47"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.355608 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.355657 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.380179 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.380283 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbczf\" (UniqueName: \"kubernetes.io/projected/95d185c7-30e1-4efc-acd8-5ce5b3784a47-kube-api-access-tbczf\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.380296 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.380305 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95d185c7-30e1-4efc-acd8-5ce5b3784a47-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.481223 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.499602 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.579444 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd7dbbffc-c2s4k"] Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.580054 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" podUID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerName="dnsmasq-dns" containerID="cri-o://a9f1b48810e30ab0b71246cf0f089634d34c6dc36d182577ed0def84fd0ff702" gracePeriod=10 Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.773502 4732 generic.go:334] "Generic (PLEG): container finished" podID="e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" containerID="37cc9165f0d1920ae33dae00b95b24914cdd3fbd5dce1ff153580c83b03a2d77" exitCode=0 Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.773600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c29q6" event={"ID":"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6","Type":"ContainerDied","Data":"37cc9165f0d1920ae33dae00b95b24914cdd3fbd5dce1ff153580c83b03a2d77"} Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.781022 4732 generic.go:334] "Generic (PLEG): container finished" podID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerID="a9f1b48810e30ab0b71246cf0f089634d34c6dc36d182577ed0def84fd0ff702" exitCode=0 Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.781093 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" event={"ID":"39f93b6c-c094-4275-ade1-5b4b2d95143c","Type":"ContainerDied","Data":"a9f1b48810e30ab0b71246cf0f089634d34c6dc36d182577ed0def84fd0ff702"} Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.783715 4732 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vn6nz" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.783957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vn6nz" event={"ID":"95d185c7-30e1-4efc-acd8-5ce5b3784a47","Type":"ContainerDied","Data":"c3f157520402bcbd0f4b0216402b279edd621ebf3d2554957745d188a90e673e"} Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.783993 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3f157520402bcbd0f4b0216402b279edd621ebf3d2554957745d188a90e673e" Oct 10 07:11:55 crc kubenswrapper[4732]: I1010 07:11:55.822841 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.025033 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.025272 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-log" containerID="cri-o://71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb" gracePeriod=30 Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.025848 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-api" containerID="cri-o://517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e" gracePeriod=30 Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.034953 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": EOF" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.034979 4732 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.182:8774/\": EOF" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.046495 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.047981 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-log" containerID="cri-o://aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc" gracePeriod=30 Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.050220 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-metadata" containerID="cri-o://ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94" gracePeriod=30 Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.169135 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.169196 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.288045 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67321190_8765_49cf_98b9_3dbd3c6b45cb.slice/crio-71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab751a16_bca4_4b6c_ae02_00476f5f2098.slice/crio-conmon-aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc.scope\": 
RecentStats: unable to find data in memory cache]" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.360366 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.360831 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.504764 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-sb\") pod \"39f93b6c-c094-4275-ade1-5b4b2d95143c\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.504851 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-svc\") pod \"39f93b6c-c094-4275-ade1-5b4b2d95143c\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.504953 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54wpv\" (UniqueName: \"kubernetes.io/projected/39f93b6c-c094-4275-ade1-5b4b2d95143c-kube-api-access-54wpv\") pod \"39f93b6c-c094-4275-ade1-5b4b2d95143c\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.505022 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-nb\") pod \"39f93b6c-c094-4275-ade1-5b4b2d95143c\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.505075 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-config\") pod \"39f93b6c-c094-4275-ade1-5b4b2d95143c\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.505134 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-swift-storage-0\") pod \"39f93b6c-c094-4275-ade1-5b4b2d95143c\" (UID: \"39f93b6c-c094-4275-ade1-5b4b2d95143c\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.513884 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f93b6c-c094-4275-ade1-5b4b2d95143c-kube-api-access-54wpv" (OuterVolumeSpecName: "kube-api-access-54wpv") pod "39f93b6c-c094-4275-ade1-5b4b2d95143c" (UID: "39f93b6c-c094-4275-ade1-5b4b2d95143c"). InnerVolumeSpecName "kube-api-access-54wpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.563452 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-config" (OuterVolumeSpecName: "config") pod "39f93b6c-c094-4275-ade1-5b4b2d95143c" (UID: "39f93b6c-c094-4275-ade1-5b4b2d95143c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.563673 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39f93b6c-c094-4275-ade1-5b4b2d95143c" (UID: "39f93b6c-c094-4275-ade1-5b4b2d95143c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.569126 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39f93b6c-c094-4275-ade1-5b4b2d95143c" (UID: "39f93b6c-c094-4275-ade1-5b4b2d95143c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.574267 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.575382 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39f93b6c-c094-4275-ade1-5b4b2d95143c" (UID: "39f93b6c-c094-4275-ade1-5b4b2d95143c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.586396 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39f93b6c-c094-4275-ade1-5b4b2d95143c" (UID: "39f93b6c-c094-4275-ade1-5b4b2d95143c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.608858 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.608901 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.608919 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.608941 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.608955 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54wpv\" (UniqueName: \"kubernetes.io/projected/39f93b6c-c094-4275-ade1-5b4b2d95143c-kube-api-access-54wpv\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.608968 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39f93b6c-c094-4275-ade1-5b4b2d95143c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.709941 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-nova-metadata-tls-certs\") pod \"ab751a16-bca4-4b6c-ae02-00476f5f2098\" (UID: 
\"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.710067 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-combined-ca-bundle\") pod \"ab751a16-bca4-4b6c-ae02-00476f5f2098\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.710129 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x272n\" (UniqueName: \"kubernetes.io/projected/ab751a16-bca4-4b6c-ae02-00476f5f2098-kube-api-access-x272n\") pod \"ab751a16-bca4-4b6c-ae02-00476f5f2098\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.710168 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab751a16-bca4-4b6c-ae02-00476f5f2098-logs\") pod \"ab751a16-bca4-4b6c-ae02-00476f5f2098\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.710241 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-config-data\") pod \"ab751a16-bca4-4b6c-ae02-00476f5f2098\" (UID: \"ab751a16-bca4-4b6c-ae02-00476f5f2098\") " Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.710546 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab751a16-bca4-4b6c-ae02-00476f5f2098-logs" (OuterVolumeSpecName: "logs") pod "ab751a16-bca4-4b6c-ae02-00476f5f2098" (UID: "ab751a16-bca4-4b6c-ae02-00476f5f2098"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.710737 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab751a16-bca4-4b6c-ae02-00476f5f2098-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.715611 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab751a16-bca4-4b6c-ae02-00476f5f2098-kube-api-access-x272n" (OuterVolumeSpecName: "kube-api-access-x272n") pod "ab751a16-bca4-4b6c-ae02-00476f5f2098" (UID: "ab751a16-bca4-4b6c-ae02-00476f5f2098"). InnerVolumeSpecName "kube-api-access-x272n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.741169 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab751a16-bca4-4b6c-ae02-00476f5f2098" (UID: "ab751a16-bca4-4b6c-ae02-00476f5f2098"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.742362 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-config-data" (OuterVolumeSpecName: "config-data") pod "ab751a16-bca4-4b6c-ae02-00476f5f2098" (UID: "ab751a16-bca4-4b6c-ae02-00476f5f2098"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.772301 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ab751a16-bca4-4b6c-ae02-00476f5f2098" (UID: "ab751a16-bca4-4b6c-ae02-00476f5f2098"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.792160 4732 generic.go:334] "Generic (PLEG): container finished" podID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerID="71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb" exitCode=143 Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.792215 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67321190-8765-49cf-98b9-3dbd3c6b45cb","Type":"ContainerDied","Data":"71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb"} Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.794014 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" event={"ID":"39f93b6c-c094-4275-ade1-5b4b2d95143c","Type":"ContainerDied","Data":"2cf91532bf467b39018801e8c159865dd55d88b20075e91c3e43538c5f572120"} Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.794053 4732 scope.go:117] "RemoveContainer" containerID="a9f1b48810e30ab0b71246cf0f089634d34c6dc36d182577ed0def84fd0ff702" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.794211 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd7dbbffc-c2s4k" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.809562 4732 generic.go:334] "Generic (PLEG): container finished" podID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerID="ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94" exitCode=0 Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.809586 4732 generic.go:334] "Generic (PLEG): container finished" podID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerID="aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc" exitCode=143 Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.810397 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab751a16-bca4-4b6c-ae02-00476f5f2098","Type":"ContainerDied","Data":"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94"} Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.810540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab751a16-bca4-4b6c-ae02-00476f5f2098","Type":"ContainerDied","Data":"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc"} Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.810639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab751a16-bca4-4b6c-ae02-00476f5f2098","Type":"ContainerDied","Data":"8cb848ae27a973a7afe53b7b6be4e50f0d3aaf96dbfb5fd4caac8654faa514da"} Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.810599 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.812411 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.813579 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x272n\" (UniqueName: \"kubernetes.io/projected/ab751a16-bca4-4b6c-ae02-00476f5f2098-kube-api-access-x272n\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.813594 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.813619 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab751a16-bca4-4b6c-ae02-00476f5f2098-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.840106 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fd7dbbffc-c2s4k"] Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.842430 4732 scope.go:117] "RemoveContainer" containerID="9f497b665fa9e42cf7693108cecd316d9307d9fd4c3d03a5df7b8a4863f3c4a6" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.855271 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fd7dbbffc-c2s4k"] Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.869042 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.883313 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:56 crc 
kubenswrapper[4732]: I1010 07:11:56.897262 4732 scope.go:117] "RemoveContainer" containerID="ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.914498 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.915075 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d185c7-30e1-4efc-acd8-5ce5b3784a47" containerName="nova-manage" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915187 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d185c7-30e1-4efc-acd8-5ce5b3784a47" containerName="nova-manage" Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.915240 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerName="init" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915250 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerName="init" Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.915261 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-metadata" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915268 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-metadata" Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.915281 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerName="dnsmasq-dns" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915289 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerName="dnsmasq-dns" Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.915299 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-log" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915307 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-log" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915537 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f93b6c-c094-4275-ade1-5b4b2d95143c" containerName="dnsmasq-dns" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915553 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d185c7-30e1-4efc-acd8-5ce5b3784a47" containerName="nova-manage" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915566 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-metadata" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.915593 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" containerName="nova-metadata-log" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.917243 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.923287 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.925338 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.926087 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.950562 4732 scope.go:117] "RemoveContainer" containerID="aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.967994 4732 scope.go:117] "RemoveContainer" containerID="ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94" Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.972164 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94\": container with ID starting with ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94 not found: ID does not exist" containerID="ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.972239 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94"} err="failed to get container status \"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94\": rpc error: code = NotFound desc = could not find container \"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94\": container with ID starting with ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94 not found: ID does not exist" Oct 10 07:11:56 crc 
kubenswrapper[4732]: I1010 07:11:56.972294 4732 scope.go:117] "RemoveContainer" containerID="aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc" Oct 10 07:11:56 crc kubenswrapper[4732]: E1010 07:11:56.975542 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc\": container with ID starting with aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc not found: ID does not exist" containerID="aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.975586 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc"} err="failed to get container status \"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc\": rpc error: code = NotFound desc = could not find container \"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc\": container with ID starting with aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc not found: ID does not exist" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.975610 4732 scope.go:117] "RemoveContainer" containerID="ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.978714 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94"} err="failed to get container status \"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94\": rpc error: code = NotFound desc = could not find container \"ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94\": container with ID starting with ba10c2a344d2b66e68cf1cb18277884a15999ac24d8037b4a2983d3318429e94 not found: ID does not exist" Oct 10 
07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.978739 4732 scope.go:117] "RemoveContainer" containerID="aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc" Oct 10 07:11:56 crc kubenswrapper[4732]: I1010 07:11:56.984313 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc"} err="failed to get container status \"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc\": rpc error: code = NotFound desc = could not find container \"aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc\": container with ID starting with aa2f1c82032dd5a5849b8f68170f248cf8cd497ff17d80115a66139d61a0c2dc not found: ID does not exist" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.019113 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-config-data\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.019271 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.019398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a49f230a-83df-4640-92f3-da3da9cf3f1c-logs\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.019457 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2p4x\" (UniqueName: \"kubernetes.io/projected/a49f230a-83df-4640-92f3-da3da9cf3f1c-kube-api-access-k2p4x\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.019632 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.121879 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a49f230a-83df-4640-92f3-da3da9cf3f1c-logs\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.121987 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2p4x\" (UniqueName: \"kubernetes.io/projected/a49f230a-83df-4640-92f3-da3da9cf3f1c-kube-api-access-k2p4x\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.122104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.122155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-config-data\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.122211 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.122588 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a49f230a-83df-4640-92f3-da3da9cf3f1c-logs\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.128089 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-config-data\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.128451 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.128741 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc 
kubenswrapper[4732]: I1010 07:11:57.140510 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2p4x\" (UniqueName: \"kubernetes.io/projected/a49f230a-83df-4640-92f3-da3da9cf3f1c-kube-api-access-k2p4x\") pod \"nova-metadata-0\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.217702 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.250740 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.326511 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm58p\" (UniqueName: \"kubernetes.io/projected/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-kube-api-access-hm58p\") pod \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.326562 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-config-data\") pod \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.326612 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-scripts\") pod \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.326995 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-combined-ca-bundle\") pod \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\" (UID: \"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6\") " Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.335344 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-kube-api-access-hm58p" (OuterVolumeSpecName: "kube-api-access-hm58p") pod "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" (UID: "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6"). InnerVolumeSpecName "kube-api-access-hm58p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.335637 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-scripts" (OuterVolumeSpecName: "scripts") pod "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" (UID: "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.370277 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" (UID: "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.405090 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-config-data" (OuterVolumeSpecName: "config-data") pod "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" (UID: "e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.429570 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.429603 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm58p\" (UniqueName: \"kubernetes.io/projected/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-kube-api-access-hm58p\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.429613 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.429622 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.671934 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f93b6c-c094-4275-ade1-5b4b2d95143c" path="/var/lib/kubelet/pods/39f93b6c-c094-4275-ade1-5b4b2d95143c/volumes" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.673174 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab751a16-bca4-4b6c-ae02-00476f5f2098" path="/var/lib/kubelet/pods/ab751a16-bca4-4b6c-ae02-00476f5f2098/volumes" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.815544 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:11:57 crc kubenswrapper[4732]: W1010 07:11:57.820497 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda49f230a_83df_4640_92f3_da3da9cf3f1c.slice/crio-923a65f253976a6232c9504bfdfc01f4fabd9088892f3125c77d5881018872af WatchSource:0}: Error finding container 923a65f253976a6232c9504bfdfc01f4fabd9088892f3125c77d5881018872af: Status 404 returned error can't find the container with id 923a65f253976a6232c9504bfdfc01f4fabd9088892f3125c77d5881018872af Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.826106 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-c29q6" event={"ID":"e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6","Type":"ContainerDied","Data":"dc8fc5c6aa6fa2d3687b7232a781eff61538f4871c9cd8e9836037bb7bd0fa04"} Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.826142 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc8fc5c6aa6fa2d3687b7232a781eff61538f4871c9cd8e9836037bb7bd0fa04" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.826188 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-c29q6" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.828197 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1663f81e-0a18-486e-bbe9-4bb332fd5af7" containerName="nova-scheduler-scheduler" containerID="cri-o://073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841" gracePeriod=30 Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.922891 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:11:57 crc kubenswrapper[4732]: E1010 07:11:57.923530 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" containerName="nova-cell1-conductor-db-sync" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.923552 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" containerName="nova-cell1-conductor-db-sync" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.923821 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" containerName="nova-cell1-conductor-db-sync" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.937141 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.942523 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 07:11:57 crc kubenswrapper[4732]: I1010 07:11:57.964785 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.041842 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.041961 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.042466 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rwpx\" (UniqueName: \"kubernetes.io/projected/8d80b654-a26e-46ea-84f4-264c3c883250-kube-api-access-5rwpx\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.145732 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc 
kubenswrapper[4732]: I1010 07:11:58.145856 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rwpx\" (UniqueName: \"kubernetes.io/projected/8d80b654-a26e-46ea-84f4-264c3c883250-kube-api-access-5rwpx\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.145996 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.149635 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.152745 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.168777 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rwpx\" (UniqueName: \"kubernetes.io/projected/8d80b654-a26e-46ea-84f4-264c3c883250-kube-api-access-5rwpx\") pod \"nova-cell1-conductor-0\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.325418 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.789513 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.840514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d80b654-a26e-46ea-84f4-264c3c883250","Type":"ContainerStarted","Data":"63a8a10c8dccd10c83706e956446a412ad4cc2b1be0170c40baf3133c175aad6"} Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.844057 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a49f230a-83df-4640-92f3-da3da9cf3f1c","Type":"ContainerStarted","Data":"86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603"} Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.844115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a49f230a-83df-4640-92f3-da3da9cf3f1c","Type":"ContainerStarted","Data":"86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80"} Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.844127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a49f230a-83df-4640-92f3-da3da9cf3f1c","Type":"ContainerStarted","Data":"923a65f253976a6232c9504bfdfc01f4fabd9088892f3125c77d5881018872af"} Oct 10 07:11:58 crc kubenswrapper[4732]: I1010 07:11:58.887675 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.887655503 podStartE2EDuration="2.887655503s" podCreationTimestamp="2025-10-10 07:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:58.88570496 +0000 UTC m=+1245.955296211" watchObservedRunningTime="2025-10-10 07:11:58.887655503 +0000 UTC m=+1245.957246744" Oct 10 
07:11:59 crc kubenswrapper[4732]: I1010 07:11:59.862339 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d80b654-a26e-46ea-84f4-264c3c883250","Type":"ContainerStarted","Data":"da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8"} Oct 10 07:11:59 crc kubenswrapper[4732]: I1010 07:11:59.862670 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 10 07:11:59 crc kubenswrapper[4732]: I1010 07:11:59.890807 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.890785796 podStartE2EDuration="2.890785796s" podCreationTimestamp="2025-10-10 07:11:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:11:59.88724724 +0000 UTC m=+1246.956838491" watchObservedRunningTime="2025-10-10 07:11:59.890785796 +0000 UTC m=+1246.960377037" Oct 10 07:12:00 crc kubenswrapper[4732]: E1010 07:12:00.226459 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:12:00 crc kubenswrapper[4732]: E1010 07:12:00.227650 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:12:00 crc kubenswrapper[4732]: E1010 07:12:00.229967 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:12:00 crc kubenswrapper[4732]: E1010 07:12:00.230004 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1663f81e-0a18-486e-bbe9-4bb332fd5af7" containerName="nova-scheduler-scheduler" Oct 10 07:12:00 crc kubenswrapper[4732]: I1010 07:12:00.874259 4732 generic.go:334] "Generic (PLEG): container finished" podID="1663f81e-0a18-486e-bbe9-4bb332fd5af7" containerID="073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841" exitCode=0 Oct 10 07:12:00 crc kubenswrapper[4732]: I1010 07:12:00.874459 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1663f81e-0a18-486e-bbe9-4bb332fd5af7","Type":"ContainerDied","Data":"073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841"} Oct 10 07:12:00 crc kubenswrapper[4732]: I1010 07:12:00.875429 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1663f81e-0a18-486e-bbe9-4bb332fd5af7","Type":"ContainerDied","Data":"ad61b65581198be2df4b45c74159704c5a185e22b3c6e33c83911014e131fd34"} Oct 10 07:12:00 crc kubenswrapper[4732]: I1010 07:12:00.875445 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad61b65581198be2df4b45c74159704c5a185e22b3c6e33c83911014e131fd34" Oct 10 07:12:00 crc kubenswrapper[4732]: I1010 07:12:00.905785 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.010167 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbjck\" (UniqueName: \"kubernetes.io/projected/1663f81e-0a18-486e-bbe9-4bb332fd5af7-kube-api-access-pbjck\") pod \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.010468 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-combined-ca-bundle\") pod \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.010533 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-config-data\") pod \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\" (UID: \"1663f81e-0a18-486e-bbe9-4bb332fd5af7\") " Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.015757 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1663f81e-0a18-486e-bbe9-4bb332fd5af7-kube-api-access-pbjck" (OuterVolumeSpecName: "kube-api-access-pbjck") pod "1663f81e-0a18-486e-bbe9-4bb332fd5af7" (UID: "1663f81e-0a18-486e-bbe9-4bb332fd5af7"). InnerVolumeSpecName "kube-api-access-pbjck". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.043895 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1663f81e-0a18-486e-bbe9-4bb332fd5af7" (UID: "1663f81e-0a18-486e-bbe9-4bb332fd5af7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.050527 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-config-data" (OuterVolumeSpecName: "config-data") pod "1663f81e-0a18-486e-bbe9-4bb332fd5af7" (UID: "1663f81e-0a18-486e-bbe9-4bb332fd5af7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.113144 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbjck\" (UniqueName: \"kubernetes.io/projected/1663f81e-0a18-486e-bbe9-4bb332fd5af7-kube-api-access-pbjck\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.113178 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.113187 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1663f81e-0a18-486e-bbe9-4bb332fd5af7-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.891007 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.911152 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.923242 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.942793 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:01 crc kubenswrapper[4732]: E1010 07:12:01.943277 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1663f81e-0a18-486e-bbe9-4bb332fd5af7" containerName="nova-scheduler-scheduler" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.943300 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1663f81e-0a18-486e-bbe9-4bb332fd5af7" containerName="nova-scheduler-scheduler" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.943480 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1663f81e-0a18-486e-bbe9-4bb332fd5af7" containerName="nova-scheduler-scheduler" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.944138 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.946055 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 07:12:01 crc kubenswrapper[4732]: I1010 07:12:01.952387 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.027609 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dgks\" (UniqueName: \"kubernetes.io/projected/fb7a871c-04b4-491b-a767-b01a2b3b38cf-kube-api-access-9dgks\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.028064 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-config-data\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.028118 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.130188 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dgks\" (UniqueName: \"kubernetes.io/projected/fb7a871c-04b4-491b-a767-b01a2b3b38cf-kube-api-access-9dgks\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.130313 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-config-data\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.130351 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.134139 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.134234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-config-data\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.152501 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dgks\" (UniqueName: \"kubernetes.io/projected/fb7a871c-04b4-491b-a767-b01a2b3b38cf-kube-api-access-9dgks\") pod \"nova-scheduler-0\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.251619 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.251974 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.269516 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.754968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:02 crc kubenswrapper[4732]: W1010 07:12:02.758882 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb7a871c_04b4_491b_a767_b01a2b3b38cf.slice/crio-15e204034583164a598297706a7c88e6d819d089fbf7393124fcbc4530c8e349 WatchSource:0}: Error finding container 15e204034583164a598297706a7c88e6d819d089fbf7393124fcbc4530c8e349: Status 404 returned error can't find the container with id 15e204034583164a598297706a7c88e6d819d089fbf7393124fcbc4530c8e349 Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.856355 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.903862 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7a871c-04b4-491b-a767-b01a2b3b38cf","Type":"ContainerStarted","Data":"15e204034583164a598297706a7c88e6d819d089fbf7393124fcbc4530c8e349"} Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.906246 4732 generic.go:334] "Generic (PLEG): container finished" podID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerID="517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e" exitCode=0 Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.906283 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67321190-8765-49cf-98b9-3dbd3c6b45cb","Type":"ContainerDied","Data":"517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e"} Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.906310 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67321190-8765-49cf-98b9-3dbd3c6b45cb","Type":"ContainerDied","Data":"1c81f151660b29eb0b011ed8ae699aeed979ac4daf2d7b28ed76c9f55f4bddd9"} Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.906336 4732 scope.go:117] "RemoveContainer" containerID="517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.906501 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.933994 4732 scope.go:117] "RemoveContainer" containerID="71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.951591 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67321190-8765-49cf-98b9-3dbd3c6b45cb-logs\") pod \"67321190-8765-49cf-98b9-3dbd3c6b45cb\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.951756 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-config-data\") pod \"67321190-8765-49cf-98b9-3dbd3c6b45cb\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.951901 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-combined-ca-bundle\") pod \"67321190-8765-49cf-98b9-3dbd3c6b45cb\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.951962 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c27fl\" (UniqueName: \"kubernetes.io/projected/67321190-8765-49cf-98b9-3dbd3c6b45cb-kube-api-access-c27fl\") pod \"67321190-8765-49cf-98b9-3dbd3c6b45cb\" (UID: \"67321190-8765-49cf-98b9-3dbd3c6b45cb\") " Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.955739 4732 scope.go:117] "RemoveContainer" containerID="517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.955835 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/67321190-8765-49cf-98b9-3dbd3c6b45cb-logs" (OuterVolumeSpecName: "logs") pod "67321190-8765-49cf-98b9-3dbd3c6b45cb" (UID: "67321190-8765-49cf-98b9-3dbd3c6b45cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.959710 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67321190-8765-49cf-98b9-3dbd3c6b45cb-kube-api-access-c27fl" (OuterVolumeSpecName: "kube-api-access-c27fl") pod "67321190-8765-49cf-98b9-3dbd3c6b45cb" (UID: "67321190-8765-49cf-98b9-3dbd3c6b45cb"). InnerVolumeSpecName "kube-api-access-c27fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:02 crc kubenswrapper[4732]: E1010 07:12:02.962680 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e\": container with ID starting with 517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e not found: ID does not exist" containerID="517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.963025 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e"} err="failed to get container status \"517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e\": rpc error: code = NotFound desc = could not find container \"517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e\": container with ID starting with 517418b7a9ab20a0f48f1f13fe57ef080844aea08f31dcf0e89ddf3d74b8452e not found: ID does not exist" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.963113 4732 scope.go:117] "RemoveContainer" containerID="71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb" Oct 10 07:12:02 crc 
kubenswrapper[4732]: E1010 07:12:02.963507 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb\": container with ID starting with 71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb not found: ID does not exist" containerID="71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.963567 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb"} err="failed to get container status \"71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb\": rpc error: code = NotFound desc = could not find container \"71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb\": container with ID starting with 71ee54cd72cd5b8bd1561073b8a9f844854c37f9d0420eec92321545202319bb not found: ID does not exist" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.982443 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c27fl\" (UniqueName: \"kubernetes.io/projected/67321190-8765-49cf-98b9-3dbd3c6b45cb-kube-api-access-c27fl\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.982528 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67321190-8765-49cf-98b9-3dbd3c6b45cb-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:02 crc kubenswrapper[4732]: I1010 07:12:02.995449 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-config-data" (OuterVolumeSpecName: "config-data") pod "67321190-8765-49cf-98b9-3dbd3c6b45cb" (UID: "67321190-8765-49cf-98b9-3dbd3c6b45cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.003446 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67321190-8765-49cf-98b9-3dbd3c6b45cb" (UID: "67321190-8765-49cf-98b9-3dbd3c6b45cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.084493 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.084537 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67321190-8765-49cf-98b9-3dbd3c6b45cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.242657 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.251759 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.273571 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:03 crc kubenswrapper[4732]: E1010 07:12:03.274204 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-api" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.274233 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-api" Oct 10 07:12:03 crc kubenswrapper[4732]: E1010 07:12:03.274285 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-log" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.274295 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-log" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.274484 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-api" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.274515 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" containerName="nova-api-log" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.276079 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.279772 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.286662 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.375561 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.396983 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzz2\" (UniqueName: \"kubernetes.io/projected/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-kube-api-access-cnzz2\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.397719 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-logs\") pod \"nova-api-0\" (UID: 
\"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.398214 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.400019 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-config-data\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.501845 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-config-data\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.501918 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnzz2\" (UniqueName: \"kubernetes.io/projected/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-kube-api-access-cnzz2\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.502000 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-logs\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.502085 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.502445 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-logs\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.506178 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.506755 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-config-data\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.523133 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnzz2\" (UniqueName: \"kubernetes.io/projected/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-kube-api-access-cnzz2\") pod \"nova-api-0\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.595339 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.673913 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1663f81e-0a18-486e-bbe9-4bb332fd5af7" path="/var/lib/kubelet/pods/1663f81e-0a18-486e-bbe9-4bb332fd5af7/volumes" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.677814 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67321190-8765-49cf-98b9-3dbd3c6b45cb" path="/var/lib/kubelet/pods/67321190-8765-49cf-98b9-3dbd3c6b45cb/volumes" Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.921842 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7a871c-04b4-491b-a767-b01a2b3b38cf","Type":"ContainerStarted","Data":"21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926"} Oct 10 07:12:03 crc kubenswrapper[4732]: I1010 07:12:03.939581 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.939559641 podStartE2EDuration="2.939559641s" podCreationTimestamp="2025-10-10 07:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:03.936351814 +0000 UTC m=+1251.005943055" watchObservedRunningTime="2025-10-10 07:12:03.939559641 +0000 UTC m=+1251.009150882" Oct 10 07:12:04 crc kubenswrapper[4732]: I1010 07:12:04.073355 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:04 crc kubenswrapper[4732]: I1010 07:12:04.934268 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54e3ee56-9d7c-4069-aa57-a5f4ed41c615","Type":"ContainerStarted","Data":"4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff"} Oct 10 07:12:04 crc kubenswrapper[4732]: I1010 07:12:04.934728 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"54e3ee56-9d7c-4069-aa57-a5f4ed41c615","Type":"ContainerStarted","Data":"31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a"} Oct 10 07:12:04 crc kubenswrapper[4732]: I1010 07:12:04.934777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54e3ee56-9d7c-4069-aa57-a5f4ed41c615","Type":"ContainerStarted","Data":"b4e61288c6f66a6e4f25e4589e2222d66559c7a1de9ce7d732ffa94b465cceda"} Oct 10 07:12:04 crc kubenswrapper[4732]: I1010 07:12:04.960513 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.960492371 podStartE2EDuration="1.960492371s" podCreationTimestamp="2025-10-10 07:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:04.953180731 +0000 UTC m=+1252.022772052" watchObservedRunningTime="2025-10-10 07:12:04.960492371 +0000 UTC m=+1252.030083612" Oct 10 07:12:06 crc kubenswrapper[4732]: I1010 07:12:06.195830 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 07:12:07 crc kubenswrapper[4732]: I1010 07:12:07.252412 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:12:07 crc kubenswrapper[4732]: I1010 07:12:07.253325 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:12:07 crc kubenswrapper[4732]: I1010 07:12:07.270810 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 07:12:08 crc kubenswrapper[4732]: I1010 07:12:08.265916 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:08 crc kubenswrapper[4732]: I1010 07:12:08.265916 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.054044 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.054300 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6bee7b7d-832a-4bd7-8efd-db27adf3664a" containerName="kube-state-metrics" containerID="cri-o://8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c" gracePeriod=30 Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.536297 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.548619 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rh89\" (UniqueName: \"kubernetes.io/projected/6bee7b7d-832a-4bd7-8efd-db27adf3664a-kube-api-access-6rh89\") pod \"6bee7b7d-832a-4bd7-8efd-db27adf3664a\" (UID: \"6bee7b7d-832a-4bd7-8efd-db27adf3664a\") " Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.557387 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bee7b7d-832a-4bd7-8efd-db27adf3664a-kube-api-access-6rh89" (OuterVolumeSpecName: "kube-api-access-6rh89") pod "6bee7b7d-832a-4bd7-8efd-db27adf3664a" (UID: "6bee7b7d-832a-4bd7-8efd-db27adf3664a"). InnerVolumeSpecName "kube-api-access-6rh89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.652659 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rh89\" (UniqueName: \"kubernetes.io/projected/6bee7b7d-832a-4bd7-8efd-db27adf3664a-kube-api-access-6rh89\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.997215 4732 generic.go:334] "Generic (PLEG): container finished" podID="6bee7b7d-832a-4bd7-8efd-db27adf3664a" containerID="8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c" exitCode=2 Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.997257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6bee7b7d-832a-4bd7-8efd-db27adf3664a","Type":"ContainerDied","Data":"8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c"} Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.997284 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6bee7b7d-832a-4bd7-8efd-db27adf3664a","Type":"ContainerDied","Data":"4de17f72e4b640b82a8780dedae2d6bf6bd018bb2d4140b5229025e18125370c"} Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.997310 4732 scope.go:117] "RemoveContainer" containerID="8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c" Oct 10 07:12:10 crc kubenswrapper[4732]: I1010 07:12:10.997332 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.024557 4732 scope.go:117] "RemoveContainer" containerID="8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c" Oct 10 07:12:11 crc kubenswrapper[4732]: E1010 07:12:11.025295 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c\": container with ID starting with 8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c not found: ID does not exist" containerID="8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.025337 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c"} err="failed to get container status \"8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c\": rpc error: code = NotFound desc = could not find container \"8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c\": container with ID starting with 8690a0053d53bfdd5495863949b7adbafac28a9fe82c5b5ce62daac8893e565c not found: ID does not exist" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.050962 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.065743 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.076099 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:12:11 crc kubenswrapper[4732]: E1010 07:12:11.076455 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bee7b7d-832a-4bd7-8efd-db27adf3664a" containerName="kube-state-metrics" Oct 10 07:12:11 crc 
kubenswrapper[4732]: I1010 07:12:11.076476 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bee7b7d-832a-4bd7-8efd-db27adf3664a" containerName="kube-state-metrics" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.076686 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bee7b7d-832a-4bd7-8efd-db27adf3664a" containerName="kube-state-metrics" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.078351 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.080063 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.080470 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.092201 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.162242 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.162362 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.162448 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.162578 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzsw6\" (UniqueName: \"kubernetes.io/projected/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-api-access-dzsw6\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.263480 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.263571 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzsw6\" (UniqueName: \"kubernetes.io/projected/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-api-access-dzsw6\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.263663 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.263723 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.267880 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.268336 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.279510 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.282601 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzsw6\" (UniqueName: \"kubernetes.io/projected/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-api-access-dzsw6\") pod \"kube-state-metrics-0\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.396110 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.673611 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bee7b7d-832a-4bd7-8efd-db27adf3664a" path="/var/lib/kubelet/pods/6bee7b7d-832a-4bd7-8efd-db27adf3664a/volumes" Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.829159 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.829415 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-central-agent" containerID="cri-o://f749aac1c3128e8bb73d2aea323ce1c264d3e4f6ebaa237a4a0da2e0f2f8d619" gracePeriod=30 Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.829480 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="proxy-httpd" containerID="cri-o://36ca83580adb755b0ec73cc7a25c4dd14e2a30895334d3561ec2e969937e858a" gracePeriod=30 Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.829487 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="sg-core" containerID="cri-o://f352ed4bc8699274befd0304977aa60b3e0b9612209d1bd2bfafdb028382f639" gracePeriod=30 Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.829552 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-notification-agent" containerID="cri-o://6ae7fb61d646953cb25b72f435163da67333a8e5be9470f635bfae752e17fbf3" gracePeriod=30 Oct 10 07:12:11 crc kubenswrapper[4732]: I1010 07:12:11.850317 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/kube-state-metrics-0"] Oct 10 07:12:12 crc kubenswrapper[4732]: I1010 07:12:12.011632 4732 generic.go:334] "Generic (PLEG): container finished" podID="c073656d-ced9-4557-b243-6436287d45f2" containerID="36ca83580adb755b0ec73cc7a25c4dd14e2a30895334d3561ec2e969937e858a" exitCode=0 Oct 10 07:12:12 crc kubenswrapper[4732]: I1010 07:12:12.011670 4732 generic.go:334] "Generic (PLEG): container finished" podID="c073656d-ced9-4557-b243-6436287d45f2" containerID="f352ed4bc8699274befd0304977aa60b3e0b9612209d1bd2bfafdb028382f639" exitCode=2 Oct 10 07:12:12 crc kubenswrapper[4732]: I1010 07:12:12.011710 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerDied","Data":"36ca83580adb755b0ec73cc7a25c4dd14e2a30895334d3561ec2e969937e858a"} Oct 10 07:12:12 crc kubenswrapper[4732]: I1010 07:12:12.011748 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerDied","Data":"f352ed4bc8699274befd0304977aa60b3e0b9612209d1bd2bfafdb028382f639"} Oct 10 07:12:12 crc kubenswrapper[4732]: I1010 07:12:12.014371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85","Type":"ContainerStarted","Data":"efce9cddac093e68f8121d0f1a22775f2c1cbc31b6718bc684d6abfec9b531f5"} Oct 10 07:12:12 crc kubenswrapper[4732]: I1010 07:12:12.270996 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 07:12:12 crc kubenswrapper[4732]: I1010 07:12:12.322770 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 07:12:13 crc kubenswrapper[4732]: I1010 07:12:13.032814 4732 generic.go:334] "Generic (PLEG): container finished" podID="c073656d-ced9-4557-b243-6436287d45f2" 
containerID="f749aac1c3128e8bb73d2aea323ce1c264d3e4f6ebaa237a4a0da2e0f2f8d619" exitCode=0 Oct 10 07:12:13 crc kubenswrapper[4732]: I1010 07:12:13.032919 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerDied","Data":"f749aac1c3128e8bb73d2aea323ce1c264d3e4f6ebaa237a4a0da2e0f2f8d619"} Oct 10 07:12:13 crc kubenswrapper[4732]: I1010 07:12:13.035787 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85","Type":"ContainerStarted","Data":"90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592"} Oct 10 07:12:13 crc kubenswrapper[4732]: I1010 07:12:13.071025 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.7243819120000001 podStartE2EDuration="2.071004484s" podCreationTimestamp="2025-10-10 07:12:11 +0000 UTC" firstStartedPulling="2025-10-10 07:12:11.853405691 +0000 UTC m=+1258.922996932" lastFinishedPulling="2025-10-10 07:12:12.200028263 +0000 UTC m=+1259.269619504" observedRunningTime="2025-10-10 07:12:13.05911849 +0000 UTC m=+1260.128709781" watchObservedRunningTime="2025-10-10 07:12:13.071004484 +0000 UTC m=+1260.140595745" Oct 10 07:12:13 crc kubenswrapper[4732]: I1010 07:12:13.088907 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 07:12:13 crc kubenswrapper[4732]: I1010 07:12:13.596057 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:12:13 crc kubenswrapper[4732]: I1010 07:12:13.596140 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:12:14 crc kubenswrapper[4732]: I1010 07:12:14.043969 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 10 
07:12:14 crc kubenswrapper[4732]: I1010 07:12:14.596840 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:14 crc kubenswrapper[4732]: I1010 07:12:14.638898 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.058816 4732 generic.go:334] "Generic (PLEG): container finished" podID="c073656d-ced9-4557-b243-6436287d45f2" containerID="6ae7fb61d646953cb25b72f435163da67333a8e5be9470f635bfae752e17fbf3" exitCode=0 Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.058958 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerDied","Data":"6ae7fb61d646953cb25b72f435163da67333a8e5be9470f635bfae752e17fbf3"} Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.369793 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537329 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-log-httpd\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537375 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537416 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rx8v\" (UniqueName: \"kubernetes.io/projected/c073656d-ced9-4557-b243-6436287d45f2-kube-api-access-9rx8v\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537449 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-sg-core-conf-yaml\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537523 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-run-httpd\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537676 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-config-data\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537730 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-scripts\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.537888 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: "c073656d-ced9-4557-b243-6436287d45f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.538242 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.538568 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: "c073656d-ced9-4557-b243-6436287d45f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.547895 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c073656d-ced9-4557-b243-6436287d45f2-kube-api-access-9rx8v" (OuterVolumeSpecName: "kube-api-access-9rx8v") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: "c073656d-ced9-4557-b243-6436287d45f2"). 
InnerVolumeSpecName "kube-api-access-9rx8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.548731 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-scripts" (OuterVolumeSpecName: "scripts") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: "c073656d-ced9-4557-b243-6436287d45f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.577681 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: "c073656d-ced9-4557-b243-6436287d45f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.640060 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.640091 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rx8v\" (UniqueName: \"kubernetes.io/projected/c073656d-ced9-4557-b243-6436287d45f2-kube-api-access-9rx8v\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.640102 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.640111 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c073656d-ced9-4557-b243-6436287d45f2-run-httpd\") on node 
\"crc\" DevicePath \"\"" Oct 10 07:12:15 crc kubenswrapper[4732]: E1010 07:12:15.647548 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle podName:c073656d-ced9-4557-b243-6436287d45f2 nodeName:}" failed. No retries permitted until 2025-10-10 07:12:16.147519401 +0000 UTC m=+1263.217110642 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: "c073656d-ced9-4557-b243-6436287d45f2") : error deleting /var/lib/kubelet/pods/c073656d-ced9-4557-b243-6436287d45f2/volume-subpaths: remove /var/lib/kubelet/pods/c073656d-ced9-4557-b243-6436287d45f2/volume-subpaths: no such file or directory Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.650153 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-config-data" (OuterVolumeSpecName: "config-data") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: "c073656d-ced9-4557-b243-6436287d45f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:15 crc kubenswrapper[4732]: I1010 07:12:15.742128 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.071892 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c073656d-ced9-4557-b243-6436287d45f2","Type":"ContainerDied","Data":"81132a6f8574d0095bd62dd92ba9069617837d9297212f13a14e88b5b841ad0c"} Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.072332 4732 scope.go:117] "RemoveContainer" containerID="36ca83580adb755b0ec73cc7a25c4dd14e2a30895334d3561ec2e969937e858a" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.072569 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.096041 4732 scope.go:117] "RemoveContainer" containerID="f352ed4bc8699274befd0304977aa60b3e0b9612209d1bd2bfafdb028382f639" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.125440 4732 scope.go:117] "RemoveContainer" containerID="6ae7fb61d646953cb25b72f435163da67333a8e5be9470f635bfae752e17fbf3" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.147930 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle\") pod \"c073656d-ced9-4557-b243-6436287d45f2\" (UID: \"c073656d-ced9-4557-b243-6436287d45f2\") " Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.174169 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c073656d-ced9-4557-b243-6436287d45f2" (UID: 
"c073656d-ced9-4557-b243-6436287d45f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.186235 4732 scope.go:117] "RemoveContainer" containerID="f749aac1c3128e8bb73d2aea323ce1c264d3e4f6ebaa237a4a0da2e0f2f8d619" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.250884 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c073656d-ced9-4557-b243-6436287d45f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.423275 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.441034 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.464973 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:16 crc kubenswrapper[4732]: E1010 07:12:16.466048 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-notification-agent" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.466233 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-notification-agent" Oct 10 07:12:16 crc kubenswrapper[4732]: E1010 07:12:16.466385 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="proxy-httpd" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.466508 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="proxy-httpd" Oct 10 07:12:16 crc kubenswrapper[4732]: E1010 07:12:16.466646 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="sg-core" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.466810 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="sg-core" Oct 10 07:12:16 crc kubenswrapper[4732]: E1010 07:12:16.466958 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-central-agent" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.467087 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-central-agent" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.467540 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-central-agent" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.467770 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="ceilometer-notification-agent" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.467990 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="sg-core" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.468154 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c073656d-ced9-4557-b243-6436287d45f2" containerName="proxy-httpd" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.470929 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.475276 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.475564 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.476116 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.490556 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-scripts\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658445 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658606 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-log-httpd\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658735 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-run-httpd\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658816 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dqf\" (UniqueName: \"kubernetes.io/projected/3bd80454-e583-4f9b-87b6-378015d4016b-kube-api-access-c5dqf\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658842 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-config-data\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.658971 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760286 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760395 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-log-httpd\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760468 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-run-httpd\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760544 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dqf\" (UniqueName: \"kubernetes.io/projected/3bd80454-e583-4f9b-87b6-378015d4016b-kube-api-access-c5dqf\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-config-data\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760630 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0" Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760662 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-scripts\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.760858 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-run-httpd\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.762067 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-log-httpd\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.765836 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.766211 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.766631 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-config-data\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.767813 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-scripts\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.771107 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.779245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dqf\" (UniqueName: \"kubernetes.io/projected/3bd80454-e583-4f9b-87b6-378015d4016b-kube-api-access-c5dqf\") pod \"ceilometer-0\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " pod="openstack/ceilometer-0"
Oct 10 07:12:16 crc kubenswrapper[4732]: I1010 07:12:16.798379 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 10 07:12:17 crc kubenswrapper[4732]: I1010 07:12:17.258752 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 10 07:12:17 crc kubenswrapper[4732]: I1010 07:12:17.262927 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 10 07:12:17 crc kubenswrapper[4732]: I1010 07:12:17.264158 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 10 07:12:17 crc kubenswrapper[4732]: I1010 07:12:17.301937 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 07:12:17 crc kubenswrapper[4732]: I1010 07:12:17.671377 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c073656d-ced9-4557-b243-6436287d45f2" path="/var/lib/kubelet/pods/c073656d-ced9-4557-b243-6436287d45f2/volumes"
Oct 10 07:12:18 crc kubenswrapper[4732]: I1010 07:12:18.104502 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerStarted","Data":"4161e02bbc31833c31ec87f227788cb371f19e5b65bde35425435b4aadafb28e"}
Oct 10 07:12:18 crc kubenswrapper[4732]: I1010 07:12:18.110774 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 10 07:12:19 crc kubenswrapper[4732]: I1010 07:12:19.116312 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerStarted","Data":"4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a"}
Oct 10 07:12:19 crc kubenswrapper[4732]: I1010 07:12:19.116623 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerStarted","Data":"6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc"}
Oct 10 07:12:20 crc kubenswrapper[4732]: I1010 07:12:20.130555 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerStarted","Data":"da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e"}
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.101879 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.152555 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerStarted","Data":"f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed"}
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.152646 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.154595 4732 generic.go:334] "Generic (PLEG): container finished" podID="e7bf1479-c766-4541-8b5d-38b98d8929b7" containerID="743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e" exitCode=137
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.154638 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7bf1479-c766-4541-8b5d-38b98d8929b7","Type":"ContainerDied","Data":"743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e"}
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.154660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e7bf1479-c766-4541-8b5d-38b98d8929b7","Type":"ContainerDied","Data":"adadffc3791667c1f304b68e0d061a1502226cb6c35d542977770dc7c7a31881"}
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.154683 4732 scope.go:117] "RemoveContainer" containerID="743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.154914 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.183728 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.664597005 podStartE2EDuration="5.183712576s" podCreationTimestamp="2025-10-10 07:12:16 +0000 UTC" firstStartedPulling="2025-10-10 07:12:17.292363925 +0000 UTC m=+1264.361955166" lastFinishedPulling="2025-10-10 07:12:20.811479486 +0000 UTC m=+1267.881070737" observedRunningTime="2025-10-10 07:12:21.17724036 +0000 UTC m=+1268.246831621" watchObservedRunningTime="2025-10-10 07:12:21.183712576 +0000 UTC m=+1268.253303817"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.186938 4732 scope.go:117] "RemoveContainer" containerID="743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e"
Oct 10 07:12:21 crc kubenswrapper[4732]: E1010 07:12:21.187419 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e\": container with ID starting with 743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e not found: ID does not exist" containerID="743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.187459 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e"} err="failed to get container status \"743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e\": rpc error: code = NotFound desc = could not find container \"743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e\": container with ID starting with 743524cc4267658657f33e7e47a1c9792f65b21956c96fc78b590bc950a6350e not found: ID does not exist"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.237779 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tprkn\" (UniqueName: \"kubernetes.io/projected/e7bf1479-c766-4541-8b5d-38b98d8929b7-kube-api-access-tprkn\") pod \"e7bf1479-c766-4541-8b5d-38b98d8929b7\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") "
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.237831 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-combined-ca-bundle\") pod \"e7bf1479-c766-4541-8b5d-38b98d8929b7\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") "
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.238065 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-config-data\") pod \"e7bf1479-c766-4541-8b5d-38b98d8929b7\" (UID: \"e7bf1479-c766-4541-8b5d-38b98d8929b7\") "
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.243373 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bf1479-c766-4541-8b5d-38b98d8929b7-kube-api-access-tprkn" (OuterVolumeSpecName: "kube-api-access-tprkn") pod "e7bf1479-c766-4541-8b5d-38b98d8929b7" (UID: "e7bf1479-c766-4541-8b5d-38b98d8929b7"). InnerVolumeSpecName "kube-api-access-tprkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.273680 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-config-data" (OuterVolumeSpecName: "config-data") pod "e7bf1479-c766-4541-8b5d-38b98d8929b7" (UID: "e7bf1479-c766-4541-8b5d-38b98d8929b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.273757 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7bf1479-c766-4541-8b5d-38b98d8929b7" (UID: "e7bf1479-c766-4541-8b5d-38b98d8929b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.341181 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tprkn\" (UniqueName: \"kubernetes.io/projected/e7bf1479-c766-4541-8b5d-38b98d8929b7-kube-api-access-tprkn\") on node \"crc\" DevicePath \"\""
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.341242 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.341263 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7bf1479-c766-4541-8b5d-38b98d8929b7-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.407929 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.507442 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.521642 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.535454 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 10 07:12:21 crc kubenswrapper[4732]: E1010 07:12:21.535894 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bf1479-c766-4541-8b5d-38b98d8929b7" containerName="nova-cell1-novncproxy-novncproxy"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.535911 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bf1479-c766-4541-8b5d-38b98d8929b7" containerName="nova-cell1-novncproxy-novncproxy"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.536095 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bf1479-c766-4541-8b5d-38b98d8929b7" containerName="nova-cell1-novncproxy-novncproxy"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.536721 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.541110 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.541277 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.541859 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.544525 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.544710 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.544877 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.544996 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfflx\" (UniqueName: \"kubernetes.io/projected/710f9fa6-588e-4226-a65d-5220d0a1f315-kube-api-access-lfflx\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.545123 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.547406 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.646370 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.646423 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfflx\" (UniqueName: \"kubernetes.io/projected/710f9fa6-588e-4226-a65d-5220d0a1f315-kube-api-access-lfflx\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.646460 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.646534 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.646569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.650514 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.650873 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.651332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.653988 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.663119 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfflx\" (UniqueName: \"kubernetes.io/projected/710f9fa6-588e-4226-a65d-5220d0a1f315-kube-api-access-lfflx\") pod \"nova-cell1-novncproxy-0\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.682354 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bf1479-c766-4541-8b5d-38b98d8929b7" path="/var/lib/kubelet/pods/e7bf1479-c766-4541-8b5d-38b98d8929b7/volumes"
Oct 10 07:12:21 crc kubenswrapper[4732]: I1010 07:12:21.868668 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:22 crc kubenswrapper[4732]: I1010 07:12:22.375775 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 10 07:12:23 crc kubenswrapper[4732]: I1010 07:12:23.178730 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"710f9fa6-588e-4226-a65d-5220d0a1f315","Type":"ContainerStarted","Data":"f9506fceaa77699397e7b29b0e67d5a568de582fd92174a2332250afa9eed955"}
Oct 10 07:12:23 crc kubenswrapper[4732]: I1010 07:12:23.179032 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"710f9fa6-588e-4226-a65d-5220d0a1f315","Type":"ContainerStarted","Data":"276b8a0dba902f244256d6f181af88820089bdaf2659e8b864712ff6faa77dbd"}
Oct 10 07:12:23 crc kubenswrapper[4732]: I1010 07:12:23.210432 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.210396621 podStartE2EDuration="2.210396621s" podCreationTimestamp="2025-10-10 07:12:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:23.195792293 +0000 UTC m=+1270.265383604" watchObservedRunningTime="2025-10-10 07:12:23.210396621 +0000 UTC m=+1270.279987902"
Oct 10 07:12:23 crc kubenswrapper[4732]: I1010 07:12:23.600208 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 10 07:12:23 crc kubenswrapper[4732]: I1010 07:12:23.601399 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 10 07:12:23 crc kubenswrapper[4732]: I1010 07:12:23.603954 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 10 07:12:23 crc kubenswrapper[4732]: I1010 07:12:23.605953 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.196525 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.201035 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.398812 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b59764b5c-95h4h"]
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.401184 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.418391 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b59764b5c-95h4h"]
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.512684 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-svc\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.512984 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.513035 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.513072 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-swift-storage-0\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.513118 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-config\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.513134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cbtv\" (UniqueName: \"kubernetes.io/projected/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-kube-api-access-2cbtv\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.615194 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-swift-storage-0\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.615304 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-config\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.615329 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cbtv\" (UniqueName: \"kubernetes.io/projected/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-kube-api-access-2cbtv\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.615389 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-svc\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.615438 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.615498 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.616564 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-nb\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.617246 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-swift-storage-0\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.617875 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-config\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.619246 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-sb\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.619793 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-svc\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.642765 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cbtv\" (UniqueName: \"kubernetes.io/projected/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-kube-api-access-2cbtv\") pod \"dnsmasq-dns-6b59764b5c-95h4h\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:24 crc kubenswrapper[4732]: I1010 07:12:24.742590 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:25 crc kubenswrapper[4732]: I1010 07:12:25.284322 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b59764b5c-95h4h"]
Oct 10 07:12:25 crc kubenswrapper[4732]: I1010 07:12:25.358722 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 07:12:25 crc kubenswrapper[4732]: I1010 07:12:25.358803 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 07:12:26 crc kubenswrapper[4732]: I1010 07:12:26.213187 4732 generic.go:334] "Generic (PLEG): container finished" podID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" containerID="731da02a1e6751e1d7213579ef56876796ccc65a87796ddba50224b4c70cdce5" exitCode=0
Oct 10 07:12:26 crc kubenswrapper[4732]: I1010 07:12:26.213297 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" event={"ID":"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5","Type":"ContainerDied","Data":"731da02a1e6751e1d7213579ef56876796ccc65a87796ddba50224b4c70cdce5"}
Oct 10 07:12:26 crc kubenswrapper[4732]: I1010 07:12:26.214565 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" event={"ID":"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5","Type":"ContainerStarted","Data":"71483a8745aabe0f9f451413ccd060c88d13aceaa337204ee9fa7029f31eee0b"}
Oct 10 07:12:26 crc kubenswrapper[4732]: I1010 07:12:26.869704 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.081781 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.082405 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-central-agent" containerID="cri-o://6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc" gracePeriod=30
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.082545 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="sg-core" containerID="cri-o://da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e" gracePeriod=30
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.082657 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-notification-agent" containerID="cri-o://4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a" gracePeriod=30
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.082574 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="proxy-httpd" containerID="cri-o://f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed" gracePeriod=30
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.233100 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" event={"ID":"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5","Type":"ContainerStarted","Data":"6a5d33aa67dcba33d9cca4c7fdc0763242994866d812f7cf3aeebaaa727f6f35"}
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.234303 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h"
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.240043 4732 generic.go:334] "Generic (PLEG): container finished" podID="3bd80454-e583-4f9b-87b6-378015d4016b" containerID="f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed" exitCode=0
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.240059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerDied","Data":"f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed"}
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.240074 4732 generic.go:334] "Generic (PLEG): container finished" podID="3bd80454-e583-4f9b-87b6-378015d4016b" containerID="da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e" exitCode=2
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.240096 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerDied","Data":"da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e"}
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.271201 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" podStartSLOduration=3.271180703 podStartE2EDuration="3.271180703s" podCreationTimestamp="2025-10-10 07:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:27.266261708 +0000 UTC m=+1274.335852979" watchObservedRunningTime="2025-10-10 07:12:27.271180703 +0000 UTC m=+1274.340771944"
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.431783 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.432049 4732 kuberuntime_container.go:808]
"Killing container with a grace period" pod="openstack/nova-api-0" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-log" containerID="cri-o://31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a" gracePeriod=30 Oct 10 07:12:27 crc kubenswrapper[4732]: I1010 07:12:27.432081 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-api" containerID="cri-o://4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff" gracePeriod=30 Oct 10 07:12:28 crc kubenswrapper[4732]: I1010 07:12:28.250627 4732 generic.go:334] "Generic (PLEG): container finished" podID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerID="31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a" exitCode=143 Oct 10 07:12:28 crc kubenswrapper[4732]: I1010 07:12:28.250682 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54e3ee56-9d7c-4069-aa57-a5f4ed41c615","Type":"ContainerDied","Data":"31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a"} Oct 10 07:12:28 crc kubenswrapper[4732]: I1010 07:12:28.253142 4732 generic.go:334] "Generic (PLEG): container finished" podID="3bd80454-e583-4f9b-87b6-378015d4016b" containerID="6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc" exitCode=0 Oct 10 07:12:28 crc kubenswrapper[4732]: I1010 07:12:28.253218 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerDied","Data":"6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc"} Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.190090 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.266240 4732 generic.go:334] "Generic (PLEG): container finished" podID="3bd80454-e583-4f9b-87b6-378015d4016b" containerID="4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a" exitCode=0 Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.267350 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.267472 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerDied","Data":"4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a"} Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.267516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bd80454-e583-4f9b-87b6-378015d4016b","Type":"ContainerDied","Data":"4161e02bbc31833c31ec87f227788cb371f19e5b65bde35425435b4aadafb28e"} Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.267543 4732 scope.go:117] "RemoveContainer" containerID="f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.296108 4732 scope.go:117] "RemoveContainer" containerID="da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.304817 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-config-data\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.304870 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-sg-core-conf-yaml\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.304961 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-combined-ca-bundle\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.305023 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-ceilometer-tls-certs\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.305066 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5dqf\" (UniqueName: \"kubernetes.io/projected/3bd80454-e583-4f9b-87b6-378015d4016b-kube-api-access-c5dqf\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.305110 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-log-httpd\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.305138 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-scripts\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 
07:12:29.305229 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-run-httpd\") pod \"3bd80454-e583-4f9b-87b6-378015d4016b\" (UID: \"3bd80454-e583-4f9b-87b6-378015d4016b\") " Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.306305 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.307480 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.315070 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd80454-e583-4f9b-87b6-378015d4016b-kube-api-access-c5dqf" (OuterVolumeSpecName: "kube-api-access-c5dqf") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "kube-api-access-c5dqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.315279 4732 scope.go:117] "RemoveContainer" containerID="4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.327176 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-scripts" (OuterVolumeSpecName: "scripts") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.336955 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.361340 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.384131 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.407101 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.407130 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.407139 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.407149 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5dqf\" (UniqueName: \"kubernetes.io/projected/3bd80454-e583-4f9b-87b6-378015d4016b-kube-api-access-c5dqf\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.407159 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.407167 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.407175 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bd80454-e583-4f9b-87b6-378015d4016b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.410610 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-config-data" (OuterVolumeSpecName: "config-data") pod "3bd80454-e583-4f9b-87b6-378015d4016b" (UID: "3bd80454-e583-4f9b-87b6-378015d4016b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.473722 4732 scope.go:117] "RemoveContainer" containerID="6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.504585 4732 scope.go:117] "RemoveContainer" containerID="f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed" Oct 10 07:12:29 crc kubenswrapper[4732]: E1010 07:12:29.504977 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed\": container with ID starting with f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed not found: ID does not exist" containerID="f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.505017 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed"} err="failed to get container status \"f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed\": rpc error: code = NotFound desc = could not find container \"f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed\": container with ID starting with f70189c6005bf0e37383f05a24a1ab81a575f1e85003b567761271639f9894ed not found: ID does not exist" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.505047 4732 scope.go:117] "RemoveContainer" containerID="da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e" Oct 10 07:12:29 crc kubenswrapper[4732]: 
E1010 07:12:29.505490 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e\": container with ID starting with da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e not found: ID does not exist" containerID="da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.505512 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e"} err="failed to get container status \"da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e\": rpc error: code = NotFound desc = could not find container \"da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e\": container with ID starting with da001358706fff9a5e9b39dc49f39c60fa7de6e0984f1192880a3a9605957c9e not found: ID does not exist" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.505524 4732 scope.go:117] "RemoveContainer" containerID="4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a" Oct 10 07:12:29 crc kubenswrapper[4732]: E1010 07:12:29.505870 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a\": container with ID starting with 4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a not found: ID does not exist" containerID="4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.505907 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a"} err="failed to get container status \"4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a\": 
rpc error: code = NotFound desc = could not find container \"4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a\": container with ID starting with 4ecda5fd05af4f84f4d74847609d7ea5f19aef77fd3bff1b87b1c0fe1faff88a not found: ID does not exist" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.505928 4732 scope.go:117] "RemoveContainer" containerID="6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc" Oct 10 07:12:29 crc kubenswrapper[4732]: E1010 07:12:29.506279 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc\": container with ID starting with 6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc not found: ID does not exist" containerID="6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.506310 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc"} err="failed to get container status \"6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc\": rpc error: code = NotFound desc = could not find container \"6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc\": container with ID starting with 6cae6d7b2704e4beec41fffafbb83ac72f0e0e6cffbe9df51ebe0daa04e4a9dc not found: ID does not exist" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.508732 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bd80454-e583-4f9b-87b6-378015d4016b-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.623153 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.633927 4732 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.654878 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:29 crc kubenswrapper[4732]: E1010 07:12:29.655351 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="sg-core" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655377 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="sg-core" Oct 10 07:12:29 crc kubenswrapper[4732]: E1010 07:12:29.655410 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-central-agent" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655418 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-central-agent" Oct 10 07:12:29 crc kubenswrapper[4732]: E1010 07:12:29.655442 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-notification-agent" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655450 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-notification-agent" Oct 10 07:12:29 crc kubenswrapper[4732]: E1010 07:12:29.655465 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="proxy-httpd" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655472 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="proxy-httpd" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655880 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" 
containerName="sg-core" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655900 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="proxy-httpd" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655921 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-central-agent" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.655929 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" containerName="ceilometer-notification-agent" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.658068 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.663499 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.663615 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.663855 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.686813 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd80454-e583-4f9b-87b6-378015d4016b" path="/var/lib/kubelet/pods/3bd80454-e583-4f9b-87b6-378015d4016b/volumes" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.699672 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.714777 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-run-httpd\") pod 
\"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.714927 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.715088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-scripts\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.715135 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl4hr\" (UniqueName: \"kubernetes.io/projected/ed592ee3-6dab-41d4-8141-bb7c31b02f73-kube-api-access-rl4hr\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.715186 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-config-data\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.715211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc 
kubenswrapper[4732]: I1010 07:12:29.715267 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.715339 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-log-httpd\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.816749 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-config-data\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.816836 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.816891 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.816950 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-log-httpd\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.817005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-run-httpd\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.817104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.817227 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-scripts\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.817271 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl4hr\" (UniqueName: \"kubernetes.io/projected/ed592ee3-6dab-41d4-8141-bb7c31b02f73-kube-api-access-rl4hr\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.817606 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-run-httpd\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.817601 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-log-httpd\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.820790 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.822286 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-config-data\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.822405 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-scripts\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.822999 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.830254 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " 
pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.834736 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl4hr\" (UniqueName: \"kubernetes.io/projected/ed592ee3-6dab-41d4-8141-bb7c31b02f73-kube-api-access-rl4hr\") pod \"ceilometer-0\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " pod="openstack/ceilometer-0" Oct 10 07:12:29 crc kubenswrapper[4732]: I1010 07:12:29.997738 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:12:30 crc kubenswrapper[4732]: I1010 07:12:30.463635 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:12:30 crc kubenswrapper[4732]: W1010 07:12:30.467169 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded592ee3_6dab_41d4_8141_bb7c31b02f73.slice/crio-250e04f22958b91dd371dc791936e49052929cd02de81d757e6bcde53c1a602f WatchSource:0}: Error finding container 250e04f22958b91dd371dc791936e49052929cd02de81d757e6bcde53c1a602f: Status 404 returned error can't find the container with id 250e04f22958b91dd371dc791936e49052929cd02de81d757e6bcde53c1a602f Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.034422 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.144742 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-config-data\") pod \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.144850 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-combined-ca-bundle\") pod \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.144996 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnzz2\" (UniqueName: \"kubernetes.io/projected/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-kube-api-access-cnzz2\") pod \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.145083 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-logs\") pod \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\" (UID: \"54e3ee56-9d7c-4069-aa57-a5f4ed41c615\") " Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.146221 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-logs" (OuterVolumeSpecName: "logs") pod "54e3ee56-9d7c-4069-aa57-a5f4ed41c615" (UID: "54e3ee56-9d7c-4069-aa57-a5f4ed41c615"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.149896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-kube-api-access-cnzz2" (OuterVolumeSpecName: "kube-api-access-cnzz2") pod "54e3ee56-9d7c-4069-aa57-a5f4ed41c615" (UID: "54e3ee56-9d7c-4069-aa57-a5f4ed41c615"). InnerVolumeSpecName "kube-api-access-cnzz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.179311 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54e3ee56-9d7c-4069-aa57-a5f4ed41c615" (UID: "54e3ee56-9d7c-4069-aa57-a5f4ed41c615"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.179968 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-config-data" (OuterVolumeSpecName: "config-data") pod "54e3ee56-9d7c-4069-aa57-a5f4ed41c615" (UID: "54e3ee56-9d7c-4069-aa57-a5f4ed41c615"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.247043 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.247072 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.247084 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnzz2\" (UniqueName: \"kubernetes.io/projected/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-kube-api-access-cnzz2\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.247093 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e3ee56-9d7c-4069-aa57-a5f4ed41c615-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.289660 4732 generic.go:334] "Generic (PLEG): container finished" podID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerID="4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff" exitCode=0 Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.289761 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54e3ee56-9d7c-4069-aa57-a5f4ed41c615","Type":"ContainerDied","Data":"4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff"} Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.289789 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"54e3ee56-9d7c-4069-aa57-a5f4ed41c615","Type":"ContainerDied","Data":"b4e61288c6f66a6e4f25e4589e2222d66559c7a1de9ce7d732ffa94b465cceda"} Oct 10 07:12:31 crc kubenswrapper[4732]: 
I1010 07:12:31.289809 4732 scope.go:117] "RemoveContainer" containerID="4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.289938 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.300591 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerStarted","Data":"03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12"} Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.300633 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerStarted","Data":"250e04f22958b91dd371dc791936e49052929cd02de81d757e6bcde53c1a602f"} Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.331212 4732 scope.go:117] "RemoveContainer" containerID="31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.332847 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.342025 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.381074 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:31 crc kubenswrapper[4732]: E1010 07:12:31.384283 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-log" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.384321 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-log" Oct 10 07:12:31 crc kubenswrapper[4732]: E1010 07:12:31.384340 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-api" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.384349 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-api" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.384545 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-api" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.384562 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" containerName="nova-api-log" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.385641 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.386060 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.397439 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.397618 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.397713 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.406744 4732 scope.go:117] "RemoveContainer" containerID="4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff" Oct 10 07:12:31 crc kubenswrapper[4732]: E1010 07:12:31.408002 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff\": container with ID starting 
with 4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff not found: ID does not exist" containerID="4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.408068 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff"} err="failed to get container status \"4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff\": rpc error: code = NotFound desc = could not find container \"4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff\": container with ID starting with 4913c2d1a9714208a3ed5cd9f2249844c412aed536414c200581dd941c3874ff not found: ID does not exist" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.408102 4732 scope.go:117] "RemoveContainer" containerID="31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a" Oct 10 07:12:31 crc kubenswrapper[4732]: E1010 07:12:31.408869 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a\": container with ID starting with 31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a not found: ID does not exist" containerID="31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.408915 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a"} err="failed to get container status \"31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a\": rpc error: code = NotFound desc = could not find container \"31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a\": container with ID starting with 31450305b9c7557796d4ff16f3a76661694c9e750b9ad29aa05be28980b3d78a not found: ID does 
not exist" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.557355 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxbj6\" (UniqueName: \"kubernetes.io/projected/0e5b6335-15eb-468a-90fe-9c2586c89195-kube-api-access-mxbj6\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.557411 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e5b6335-15eb-468a-90fe-9c2586c89195-logs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.557442 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.557504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.557551 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.557599 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-config-data\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.659027 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.659107 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.661944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-config-data\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.662232 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxbj6\" (UniqueName: \"kubernetes.io/projected/0e5b6335-15eb-468a-90fe-9c2586c89195-kube-api-access-mxbj6\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.662308 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e5b6335-15eb-468a-90fe-9c2586c89195-logs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " 
pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.662355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.662972 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e5b6335-15eb-468a-90fe-9c2586c89195-logs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.663516 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-public-tls-certs\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.667284 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-config-data\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.667798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.668920 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.674059 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e3ee56-9d7c-4069-aa57-a5f4ed41c615" path="/var/lib/kubelet/pods/54e3ee56-9d7c-4069-aa57-a5f4ed41c615/volumes" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.680798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxbj6\" (UniqueName: \"kubernetes.io/projected/0e5b6335-15eb-468a-90fe-9c2586c89195-kube-api-access-mxbj6\") pod \"nova-api-0\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.709182 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.870488 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:12:31 crc kubenswrapper[4732]: I1010 07:12:31.889060 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.199567 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:32 crc kubenswrapper[4732]: W1010 07:12:32.206896 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e5b6335_15eb_468a_90fe_9c2586c89195.slice/crio-58a087058eccce226bda9ec971983563d9276b7c2ff56bc1cf421962f6a93d71 WatchSource:0}: Error finding container 58a087058eccce226bda9ec971983563d9276b7c2ff56bc1cf421962f6a93d71: Status 404 returned error can't find the container with id 58a087058eccce226bda9ec971983563d9276b7c2ff56bc1cf421962f6a93d71 Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.314438 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerStarted","Data":"d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812"} Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.315729 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e5b6335-15eb-468a-90fe-9c2586c89195","Type":"ContainerStarted","Data":"58a087058eccce226bda9ec971983563d9276b7c2ff56bc1cf421962f6a93d71"} Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.364777 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.610954 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xdr4l"] Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.612276 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.617717 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.617919 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.620022 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xdr4l"] Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.687916 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khspr\" (UniqueName: \"kubernetes.io/projected/ba5a1acc-920d-437b-b952-80e1cb9fc587-kube-api-access-khspr\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc 
kubenswrapper[4732]: I1010 07:12:32.687949 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.688157 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-config-data\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.688197 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-scripts\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.789328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-config-data\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.789383 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-scripts\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 
07:12:32.789418 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khspr\" (UniqueName: \"kubernetes.io/projected/ba5a1acc-920d-437b-b952-80e1cb9fc587-kube-api-access-khspr\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.789461 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.794521 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-config-data\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.794608 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.797203 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-scripts\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.812403 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-khspr\" (UniqueName: \"kubernetes.io/projected/ba5a1acc-920d-437b-b952-80e1cb9fc587-kube-api-access-khspr\") pod \"nova-cell1-cell-mapping-xdr4l\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:32 crc kubenswrapper[4732]: I1010 07:12:32.935759 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:33 crc kubenswrapper[4732]: I1010 07:12:33.345788 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerStarted","Data":"7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd"} Oct 10 07:12:33 crc kubenswrapper[4732]: I1010 07:12:33.349191 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e5b6335-15eb-468a-90fe-9c2586c89195","Type":"ContainerStarted","Data":"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc"} Oct 10 07:12:33 crc kubenswrapper[4732]: I1010 07:12:33.349276 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e5b6335-15eb-468a-90fe-9c2586c89195","Type":"ContainerStarted","Data":"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42"} Oct 10 07:12:33 crc kubenswrapper[4732]: I1010 07:12:33.378824 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.378802249 podStartE2EDuration="2.378802249s" podCreationTimestamp="2025-10-10 07:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:33.366262247 +0000 UTC m=+1280.435853508" watchObservedRunningTime="2025-10-10 07:12:33.378802249 +0000 UTC m=+1280.448393490" Oct 10 07:12:33 crc kubenswrapper[4732]: I1010 07:12:33.405856 4732 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xdr4l"] Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.362763 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerStarted","Data":"f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1"} Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.363255 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.366487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xdr4l" event={"ID":"ba5a1acc-920d-437b-b952-80e1cb9fc587","Type":"ContainerStarted","Data":"d540a221c0791b2252e725e31d91ce55eb96da863e87dd65bb8ea6b3079291e6"} Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.366514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xdr4l" event={"ID":"ba5a1acc-920d-437b-b952-80e1cb9fc587","Type":"ContainerStarted","Data":"5f2198a19c79cf2d80aa57916dcf15116d687186ff065f74e954f20ba0854e97"} Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.394707 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.193395096 podStartE2EDuration="5.39466885s" podCreationTimestamp="2025-10-10 07:12:29 +0000 UTC" firstStartedPulling="2025-10-10 07:12:30.47035762 +0000 UTC m=+1277.539948861" lastFinishedPulling="2025-10-10 07:12:33.671631364 +0000 UTC m=+1280.741222615" observedRunningTime="2025-10-10 07:12:34.386876058 +0000 UTC m=+1281.456467309" watchObservedRunningTime="2025-10-10 07:12:34.39466885 +0000 UTC m=+1281.464260081" Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.404804 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xdr4l" podStartSLOduration=2.404786516 
podStartE2EDuration="2.404786516s" podCreationTimestamp="2025-10-10 07:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:34.401235469 +0000 UTC m=+1281.470826720" watchObservedRunningTime="2025-10-10 07:12:34.404786516 +0000 UTC m=+1281.474377757" Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.744911 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.841961 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b7586c88c-6ch46"] Oct 10 07:12:34 crc kubenswrapper[4732]: I1010 07:12:34.842268 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" podUID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerName="dnsmasq-dns" containerID="cri-o://b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297" gracePeriod=10 Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.371715 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.375897 4732 generic.go:334] "Generic (PLEG): container finished" podID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerID="b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297" exitCode=0 Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.375933 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" event={"ID":"a634b425-c334-4ffb-9ea0-f8deab7c9b00","Type":"ContainerDied","Data":"b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297"} Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.375952 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.375965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b7586c88c-6ch46" event={"ID":"a634b425-c334-4ffb-9ea0-f8deab7c9b00","Type":"ContainerDied","Data":"b7ea7ed5793e11dbda6361a46c75fca0201ee91ad9487007cc7a05b4c0e37a60"} Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.375983 4732 scope.go:117] "RemoveContainer" containerID="b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.407148 4732 scope.go:117] "RemoveContainer" containerID="0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.446163 4732 scope.go:117] "RemoveContainer" containerID="b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297" Oct 10 07:12:35 crc kubenswrapper[4732]: E1010 07:12:35.450141 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297\": container with ID starting with b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297 not found: ID does not exist" containerID="b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.450180 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297"} err="failed to get container status \"b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297\": rpc error: code = NotFound desc = could not find container \"b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297\": container with ID starting with b8d0903a6429f2bbb0f9d2b1bcde46d1ed297f02bcb81d15c3f6ab68b6020297 not found: ID does not exist" Oct 10 07:12:35 crc 
kubenswrapper[4732]: I1010 07:12:35.450209 4732 scope.go:117] "RemoveContainer" containerID="0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97" Oct 10 07:12:35 crc kubenswrapper[4732]: E1010 07:12:35.453885 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97\": container with ID starting with 0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97 not found: ID does not exist" containerID="0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.453908 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97"} err="failed to get container status \"0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97\": rpc error: code = NotFound desc = could not find container \"0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97\": container with ID starting with 0270e7534c1c4871871873ba6cadfb399004990b1dbde6fead183776f0716d97 not found: ID does not exist" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.542610 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-config\") pod \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.542661 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-sb\") pod \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.542685 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-svc\") pod \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.542753 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2srjq\" (UniqueName: \"kubernetes.io/projected/a634b425-c334-4ffb-9ea0-f8deab7c9b00-kube-api-access-2srjq\") pod \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.542771 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-nb\") pod \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.542820 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-swift-storage-0\") pod \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\" (UID: \"a634b425-c334-4ffb-9ea0-f8deab7c9b00\") " Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.549360 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a634b425-c334-4ffb-9ea0-f8deab7c9b00-kube-api-access-2srjq" (OuterVolumeSpecName: "kube-api-access-2srjq") pod "a634b425-c334-4ffb-9ea0-f8deab7c9b00" (UID: "a634b425-c334-4ffb-9ea0-f8deab7c9b00"). InnerVolumeSpecName "kube-api-access-2srjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.601427 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-config" (OuterVolumeSpecName: "config") pod "a634b425-c334-4ffb-9ea0-f8deab7c9b00" (UID: "a634b425-c334-4ffb-9ea0-f8deab7c9b00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.601895 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a634b425-c334-4ffb-9ea0-f8deab7c9b00" (UID: "a634b425-c334-4ffb-9ea0-f8deab7c9b00"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.613108 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a634b425-c334-4ffb-9ea0-f8deab7c9b00" (UID: "a634b425-c334-4ffb-9ea0-f8deab7c9b00"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.623320 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a634b425-c334-4ffb-9ea0-f8deab7c9b00" (UID: "a634b425-c334-4ffb-9ea0-f8deab7c9b00"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.645129 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a634b425-c334-4ffb-9ea0-f8deab7c9b00" (UID: "a634b425-c334-4ffb-9ea0-f8deab7c9b00"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.645680 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2srjq\" (UniqueName: \"kubernetes.io/projected/a634b425-c334-4ffb-9ea0-f8deab7c9b00-kube-api-access-2srjq\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.645876 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.645888 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.645917 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.645930 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.645941 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/a634b425-c334-4ffb-9ea0-f8deab7c9b00-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.729650 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b7586c88c-6ch46"] Oct 10 07:12:35 crc kubenswrapper[4732]: I1010 07:12:35.739524 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b7586c88c-6ch46"] Oct 10 07:12:37 crc kubenswrapper[4732]: I1010 07:12:37.678882 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" path="/var/lib/kubelet/pods/a634b425-c334-4ffb-9ea0-f8deab7c9b00/volumes" Oct 10 07:12:38 crc kubenswrapper[4732]: I1010 07:12:38.410766 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba5a1acc-920d-437b-b952-80e1cb9fc587" containerID="d540a221c0791b2252e725e31d91ce55eb96da863e87dd65bb8ea6b3079291e6" exitCode=0 Oct 10 07:12:38 crc kubenswrapper[4732]: I1010 07:12:38.411189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xdr4l" event={"ID":"ba5a1acc-920d-437b-b952-80e1cb9fc587","Type":"ContainerDied","Data":"d540a221c0791b2252e725e31d91ce55eb96da863e87dd65bb8ea6b3079291e6"} Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:39.877461 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.035825 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-combined-ca-bundle\") pod \"ba5a1acc-920d-437b-b952-80e1cb9fc587\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.035899 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-config-data\") pod \"ba5a1acc-920d-437b-b952-80e1cb9fc587\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.035985 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khspr\" (UniqueName: \"kubernetes.io/projected/ba5a1acc-920d-437b-b952-80e1cb9fc587-kube-api-access-khspr\") pod \"ba5a1acc-920d-437b-b952-80e1cb9fc587\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.036206 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-scripts\") pod \"ba5a1acc-920d-437b-b952-80e1cb9fc587\" (UID: \"ba5a1acc-920d-437b-b952-80e1cb9fc587\") " Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.043025 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5a1acc-920d-437b-b952-80e1cb9fc587-kube-api-access-khspr" (OuterVolumeSpecName: "kube-api-access-khspr") pod "ba5a1acc-920d-437b-b952-80e1cb9fc587" (UID: "ba5a1acc-920d-437b-b952-80e1cb9fc587"). InnerVolumeSpecName "kube-api-access-khspr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.043740 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-scripts" (OuterVolumeSpecName: "scripts") pod "ba5a1acc-920d-437b-b952-80e1cb9fc587" (UID: "ba5a1acc-920d-437b-b952-80e1cb9fc587"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.065232 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-config-data" (OuterVolumeSpecName: "config-data") pod "ba5a1acc-920d-437b-b952-80e1cb9fc587" (UID: "ba5a1acc-920d-437b-b952-80e1cb9fc587"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.065585 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba5a1acc-920d-437b-b952-80e1cb9fc587" (UID: "ba5a1acc-920d-437b-b952-80e1cb9fc587"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.138970 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.139011 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.139032 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khspr\" (UniqueName: \"kubernetes.io/projected/ba5a1acc-920d-437b-b952-80e1cb9fc587-kube-api-access-khspr\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.139048 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba5a1acc-920d-437b-b952-80e1cb9fc587-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.437617 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xdr4l" event={"ID":"ba5a1acc-920d-437b-b952-80e1cb9fc587","Type":"ContainerDied","Data":"5f2198a19c79cf2d80aa57916dcf15116d687186ff065f74e954f20ba0854e97"} Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.437666 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f2198a19c79cf2d80aa57916dcf15116d687186ff065f74e954f20ba0854e97" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.437760 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xdr4l" Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.683913 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.684393 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fb7a871c-04b4-491b-a767-b01a2b3b38cf" containerName="nova-scheduler-scheduler" containerID="cri-o://21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926" gracePeriod=30 Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.693964 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.694237 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-log" containerID="cri-o://b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42" gracePeriod=30 Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.694389 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-api" containerID="cri-o://28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc" gracePeriod=30 Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.754949 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.755209 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-log" containerID="cri-o://86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80" gracePeriod=30 Oct 10 07:12:40 crc kubenswrapper[4732]: I1010 07:12:40.755294 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-metadata" containerID="cri-o://86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603" gracePeriod=30 Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.251147 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.359442 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e5b6335-15eb-468a-90fe-9c2586c89195-logs\") pod \"0e5b6335-15eb-468a-90fe-9c2586c89195\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.359483 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-internal-tls-certs\") pod \"0e5b6335-15eb-468a-90fe-9c2586c89195\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.359567 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-combined-ca-bundle\") pod \"0e5b6335-15eb-468a-90fe-9c2586c89195\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.359613 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxbj6\" (UniqueName: \"kubernetes.io/projected/0e5b6335-15eb-468a-90fe-9c2586c89195-kube-api-access-mxbj6\") pod \"0e5b6335-15eb-468a-90fe-9c2586c89195\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.359720 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-config-data\") pod \"0e5b6335-15eb-468a-90fe-9c2586c89195\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.359789 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-public-tls-certs\") pod \"0e5b6335-15eb-468a-90fe-9c2586c89195\" (UID: \"0e5b6335-15eb-468a-90fe-9c2586c89195\") " Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.361095 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e5b6335-15eb-468a-90fe-9c2586c89195-logs" (OuterVolumeSpecName: "logs") pod "0e5b6335-15eb-468a-90fe-9c2586c89195" (UID: "0e5b6335-15eb-468a-90fe-9c2586c89195"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.365990 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5b6335-15eb-468a-90fe-9c2586c89195-kube-api-access-mxbj6" (OuterVolumeSpecName: "kube-api-access-mxbj6") pod "0e5b6335-15eb-468a-90fe-9c2586c89195" (UID: "0e5b6335-15eb-468a-90fe-9c2586c89195"). InnerVolumeSpecName "kube-api-access-mxbj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.391047 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-config-data" (OuterVolumeSpecName: "config-data") pod "0e5b6335-15eb-468a-90fe-9c2586c89195" (UID: "0e5b6335-15eb-468a-90fe-9c2586c89195"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.401061 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e5b6335-15eb-468a-90fe-9c2586c89195" (UID: "0e5b6335-15eb-468a-90fe-9c2586c89195"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.420081 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0e5b6335-15eb-468a-90fe-9c2586c89195" (UID: "0e5b6335-15eb-468a-90fe-9c2586c89195"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.439970 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e5b6335-15eb-468a-90fe-9c2586c89195" (UID: "0e5b6335-15eb-468a-90fe-9c2586c89195"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.449455 4732 generic.go:334] "Generic (PLEG): container finished" podID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerID="28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc" exitCode=0 Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.449533 4732 generic.go:334] "Generic (PLEG): container finished" podID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerID="b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42" exitCode=143 Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.449576 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e5b6335-15eb-468a-90fe-9c2586c89195","Type":"ContainerDied","Data":"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc"} Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.449627 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e5b6335-15eb-468a-90fe-9c2586c89195","Type":"ContainerDied","Data":"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42"} Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.449641 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e5b6335-15eb-468a-90fe-9c2586c89195","Type":"ContainerDied","Data":"58a087058eccce226bda9ec971983563d9276b7c2ff56bc1cf421962f6a93d71"} Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.449658 4732 scope.go:117] "RemoveContainer" containerID="28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.449898 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.453639 4732 generic.go:334] "Generic (PLEG): container finished" podID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerID="86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80" exitCode=143 Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.453710 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a49f230a-83df-4640-92f3-da3da9cf3f1c","Type":"ContainerDied","Data":"86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80"} Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.463894 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e5b6335-15eb-468a-90fe-9c2586c89195-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.463925 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.463939 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.463952 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxbj6\" (UniqueName: \"kubernetes.io/projected/0e5b6335-15eb-468a-90fe-9c2586c89195-kube-api-access-mxbj6\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.463991 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 
07:12:41.464002 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e5b6335-15eb-468a-90fe-9c2586c89195-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.478889 4732 scope.go:117] "RemoveContainer" containerID="b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.489793 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.501410 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.519918 4732 scope.go:117] "RemoveContainer" containerID="28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc" Oct 10 07:12:41 crc kubenswrapper[4732]: E1010 07:12:41.520388 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc\": container with ID starting with 28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc not found: ID does not exist" containerID="28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.520438 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc"} err="failed to get container status \"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc\": rpc error: code = NotFound desc = could not find container \"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc\": container with ID starting with 28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc not found: ID does not exist" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.520466 4732 
scope.go:117] "RemoveContainer" containerID="b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.520878 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:41 crc kubenswrapper[4732]: E1010 07:12:41.520904 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42\": container with ID starting with b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42 not found: ID does not exist" containerID="b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.520934 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42"} err="failed to get container status \"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42\": rpc error: code = NotFound desc = could not find container \"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42\": container with ID starting with b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42 not found: ID does not exist" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.520954 4732 scope.go:117] "RemoveContainer" containerID="28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc" Oct 10 07:12:41 crc kubenswrapper[4732]: E1010 07:12:41.521254 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-api" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521273 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-api" Oct 10 07:12:41 crc kubenswrapper[4732]: E1010 07:12:41.521297 4732 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerName="init" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521304 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerName="init" Oct 10 07:12:41 crc kubenswrapper[4732]: E1010 07:12:41.521318 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5a1acc-920d-437b-b952-80e1cb9fc587" containerName="nova-manage" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521324 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5a1acc-920d-437b-b952-80e1cb9fc587" containerName="nova-manage" Oct 10 07:12:41 crc kubenswrapper[4732]: E1010 07:12:41.521355 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-log" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521361 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-log" Oct 10 07:12:41 crc kubenswrapper[4732]: E1010 07:12:41.521393 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerName="dnsmasq-dns" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521399 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerName="dnsmasq-dns" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521307 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc"} err="failed to get container status \"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc\": rpc error: code = NotFound desc = could not find container \"28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc\": container with ID starting with 28de33529efbacac2cf6a177192c839823b1dff3e4ed15f972f10559f6c454dc not 
found: ID does not exist" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521489 4732 scope.go:117] "RemoveContainer" containerID="b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521548 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5a1acc-920d-437b-b952-80e1cb9fc587" containerName="nova-manage" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521570 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-log" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521586 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" containerName="nova-api-api" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521597 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a634b425-c334-4ffb-9ea0-f8deab7c9b00" containerName="dnsmasq-dns" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.521830 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42"} err="failed to get container status \"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42\": rpc error: code = NotFound desc = could not find container \"b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42\": container with ID starting with b68c38e8cb1acd1ab646d3b39d9681734338bf8df9158721f47e9d8952647b42 not found: ID does not exist" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.522725 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.527124 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.532394 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.532751 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.547200 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.668046 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-499mx\" (UniqueName: \"kubernetes.io/projected/56077f87-ea67-4080-b328-7186a7d0bf35-kube-api-access-499mx\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.668320 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56077f87-ea67-4080-b328-7186a7d0bf35-logs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.668461 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-config-data\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.668589 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.668687 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-public-tls-certs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.669112 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.673609 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e5b6335-15eb-468a-90fe-9c2586c89195" path="/var/lib/kubelet/pods/0e5b6335-15eb-468a-90fe-9c2586c89195/volumes" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.771077 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-public-tls-certs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.771156 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.771206 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-499mx\" (UniqueName: \"kubernetes.io/projected/56077f87-ea67-4080-b328-7186a7d0bf35-kube-api-access-499mx\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.771261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56077f87-ea67-4080-b328-7186a7d0bf35-logs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.771284 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-config-data\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.771874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56077f87-ea67-4080-b328-7186a7d0bf35-logs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.771401 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.774617 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-internal-tls-certs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 
crc kubenswrapper[4732]: I1010 07:12:41.775296 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-config-data\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.775339 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.776443 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-public-tls-certs\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.799926 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-499mx\" (UniqueName: \"kubernetes.io/projected/56077f87-ea67-4080-b328-7186a7d0bf35-kube-api-access-499mx\") pod \"nova-api-0\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " pod="openstack/nova-api-0" Oct 10 07:12:41 crc kubenswrapper[4732]: I1010 07:12:41.854258 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:12:42 crc kubenswrapper[4732]: E1010 07:12:42.272051 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:12:42 crc kubenswrapper[4732]: E1010 07:12:42.274132 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:12:42 crc kubenswrapper[4732]: E1010 07:12:42.275443 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:12:42 crc kubenswrapper[4732]: E1010 07:12:42.275480 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fb7a871c-04b4-491b-a767-b01a2b3b38cf" containerName="nova-scheduler-scheduler" Oct 10 07:12:42 crc kubenswrapper[4732]: I1010 07:12:42.335348 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:12:42 crc kubenswrapper[4732]: W1010 07:12:42.343285 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56077f87_ea67_4080_b328_7186a7d0bf35.slice/crio-2938f4066e9be1324b7a1269147e5f3064581b90ac433508b7f277c852ecc5d3 WatchSource:0}: Error finding container 2938f4066e9be1324b7a1269147e5f3064581b90ac433508b7f277c852ecc5d3: Status 404 returned error can't find the container with id 2938f4066e9be1324b7a1269147e5f3064581b90ac433508b7f277c852ecc5d3 Oct 10 07:12:42 crc kubenswrapper[4732]: I1010 07:12:42.466600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56077f87-ea67-4080-b328-7186a7d0bf35","Type":"ContainerStarted","Data":"2938f4066e9be1324b7a1269147e5f3064581b90ac433508b7f277c852ecc5d3"} Oct 10 07:12:43 crc kubenswrapper[4732]: I1010 07:12:43.486380 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56077f87-ea67-4080-b328-7186a7d0bf35","Type":"ContainerStarted","Data":"521cf68574ce4f3c728de951f4d7a2e5c5c7da7a4c60aac8f61aad22cdb5008d"} Oct 10 07:12:43 crc kubenswrapper[4732]: I1010 07:12:43.486763 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56077f87-ea67-4080-b328-7186a7d0bf35","Type":"ContainerStarted","Data":"101a8d6951b18acd6a4c085a8be7084960161610e3440bd4359884be73a60369"} Oct 10 07:12:43 crc kubenswrapper[4732]: I1010 07:12:43.511850 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.511827382 podStartE2EDuration="2.511827382s" podCreationTimestamp="2025-10-10 07:12:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:43.510307231 +0000 UTC m=+1290.579898552" watchObservedRunningTime="2025-10-10 07:12:43.511827382 +0000 UTC m=+1290.581418633" Oct 10 07:12:43 crc kubenswrapper[4732]: I1010 07:12:43.906222 4732 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:38046->10.217.0.189:8775: read: connection reset by peer" Oct 10 07:12:43 crc kubenswrapper[4732]: I1010 07:12:43.907005 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:38056->10.217.0.189:8775: read: connection reset by peer" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.448975 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.521126 4732 generic.go:334] "Generic (PLEG): container finished" podID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerID="86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603" exitCode=0 Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.521205 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a49f230a-83df-4640-92f3-da3da9cf3f1c","Type":"ContainerDied","Data":"86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603"} Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.521245 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.521307 4732 scope.go:117] "RemoveContainer" containerID="86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.521294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a49f230a-83df-4640-92f3-da3da9cf3f1c","Type":"ContainerDied","Data":"923a65f253976a6232c9504bfdfc01f4fabd9088892f3125c77d5881018872af"} Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.528023 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-nova-metadata-tls-certs\") pod \"a49f230a-83df-4640-92f3-da3da9cf3f1c\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.528202 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2p4x\" (UniqueName: \"kubernetes.io/projected/a49f230a-83df-4640-92f3-da3da9cf3f1c-kube-api-access-k2p4x\") pod \"a49f230a-83df-4640-92f3-da3da9cf3f1c\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.528262 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-config-data\") pod \"a49f230a-83df-4640-92f3-da3da9cf3f1c\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.528303 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a49f230a-83df-4640-92f3-da3da9cf3f1c-logs\") pod \"a49f230a-83df-4640-92f3-da3da9cf3f1c\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 
07:12:44.528409 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-combined-ca-bundle\") pod \"a49f230a-83df-4640-92f3-da3da9cf3f1c\" (UID: \"a49f230a-83df-4640-92f3-da3da9cf3f1c\") " Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.529648 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a49f230a-83df-4640-92f3-da3da9cf3f1c-logs" (OuterVolumeSpecName: "logs") pod "a49f230a-83df-4640-92f3-da3da9cf3f1c" (UID: "a49f230a-83df-4640-92f3-da3da9cf3f1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.548850 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a49f230a-83df-4640-92f3-da3da9cf3f1c-kube-api-access-k2p4x" (OuterVolumeSpecName: "kube-api-access-k2p4x") pod "a49f230a-83df-4640-92f3-da3da9cf3f1c" (UID: "a49f230a-83df-4640-92f3-da3da9cf3f1c"). InnerVolumeSpecName "kube-api-access-k2p4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.566106 4732 scope.go:117] "RemoveContainer" containerID="86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.572396 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a49f230a-83df-4640-92f3-da3da9cf3f1c" (UID: "a49f230a-83df-4640-92f3-da3da9cf3f1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.590473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-config-data" (OuterVolumeSpecName: "config-data") pod "a49f230a-83df-4640-92f3-da3da9cf3f1c" (UID: "a49f230a-83df-4640-92f3-da3da9cf3f1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.610059 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a49f230a-83df-4640-92f3-da3da9cf3f1c" (UID: "a49f230a-83df-4640-92f3-da3da9cf3f1c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.626878 4732 scope.go:117] "RemoveContainer" containerID="86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603" Oct 10 07:12:44 crc kubenswrapper[4732]: E1010 07:12:44.627362 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603\": container with ID starting with 86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603 not found: ID does not exist" containerID="86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.627417 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603"} err="failed to get container status \"86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603\": rpc error: code = NotFound desc = could not find container 
\"86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603\": container with ID starting with 86030c0147169145b50d0736ef8d6097cada37ce902c13d2ba280e849e922603 not found: ID does not exist" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.627453 4732 scope.go:117] "RemoveContainer" containerID="86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80" Oct 10 07:12:44 crc kubenswrapper[4732]: E1010 07:12:44.627942 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80\": container with ID starting with 86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80 not found: ID does not exist" containerID="86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.627976 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80"} err="failed to get container status \"86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80\": rpc error: code = NotFound desc = could not find container \"86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80\": container with ID starting with 86aafbafecb9fab1243a01bdc3d8aececc28b07186d303c4f2fa82ef432a3a80 not found: ID does not exist" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.631309 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2p4x\" (UniqueName: \"kubernetes.io/projected/a49f230a-83df-4640-92f3-da3da9cf3f1c-kube-api-access-k2p4x\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.631329 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:44 crc 
kubenswrapper[4732]: I1010 07:12:44.631338 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a49f230a-83df-4640-92f3-da3da9cf3f1c-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.631346 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.631354 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a49f230a-83df-4640-92f3-da3da9cf3f1c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.866659 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.877424 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.887114 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:12:44 crc kubenswrapper[4732]: E1010 07:12:44.887490 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-metadata" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.887505 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-metadata" Oct 10 07:12:44 crc kubenswrapper[4732]: E1010 07:12:44.887515 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-log" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.887522 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-log" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.887681 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-log" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.887731 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" containerName="nova-metadata-metadata" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.889408 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.891542 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.891768 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 07:12:44 crc kubenswrapper[4732]: I1010 07:12:44.899847 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.040069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.040162 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f34ab2c-f804-4f24-a447-165d5afb984f-logs\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.040244 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxj5t\" (UniqueName: \"kubernetes.io/projected/8f34ab2c-f804-4f24-a447-165d5afb984f-kube-api-access-kxj5t\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.040282 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-config-data\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.040303 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.143892 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.144115 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f34ab2c-f804-4f24-a447-165d5afb984f-logs\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.144297 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxj5t\" (UniqueName: 
\"kubernetes.io/projected/8f34ab2c-f804-4f24-a447-165d5afb984f-kube-api-access-kxj5t\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.144455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-config-data\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.144501 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.144538 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f34ab2c-f804-4f24-a447-165d5afb984f-logs\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.148261 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.151521 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-config-data\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc 
kubenswrapper[4732]: I1010 07:12:45.152226 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.173132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxj5t\" (UniqueName: \"kubernetes.io/projected/8f34ab2c-f804-4f24-a447-165d5afb984f-kube-api-access-kxj5t\") pod \"nova-metadata-0\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.207518 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:12:45 crc kubenswrapper[4732]: W1010 07:12:45.673562 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f34ab2c_f804_4f24_a447_165d5afb984f.slice/crio-33e429c269fea0b0f07bca5a0059bd6f53ecb56ba9f6d7232aead904e4d6fd67 WatchSource:0}: Error finding container 33e429c269fea0b0f07bca5a0059bd6f53ecb56ba9f6d7232aead904e4d6fd67: Status 404 returned error can't find the container with id 33e429c269fea0b0f07bca5a0059bd6f53ecb56ba9f6d7232aead904e4d6fd67 Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.677784 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a49f230a-83df-4640-92f3-da3da9cf3f1c" path="/var/lib/kubelet/pods/a49f230a-83df-4640-92f3-da3da9cf3f1c/volumes" Oct 10 07:12:45 crc kubenswrapper[4732]: I1010 07:12:45.678767 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.546549 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8f34ab2c-f804-4f24-a447-165d5afb984f","Type":"ContainerStarted","Data":"28679c08d1706b7a047ce63e0dbc74864b4cc97372d8f26b15d33225fb8a8912"} Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.546917 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f34ab2c-f804-4f24-a447-165d5afb984f","Type":"ContainerStarted","Data":"aaf10e61456ff1882f86011615c89d1ec3d16648bdf71900a294e95ad885a047"} Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.546941 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f34ab2c-f804-4f24-a447-165d5afb984f","Type":"ContainerStarted","Data":"33e429c269fea0b0f07bca5a0059bd6f53ecb56ba9f6d7232aead904e4d6fd67"} Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.549928 4732 generic.go:334] "Generic (PLEG): container finished" podID="fb7a871c-04b4-491b-a767-b01a2b3b38cf" containerID="21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926" exitCode=0 Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.549977 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7a871c-04b4-491b-a767-b01a2b3b38cf","Type":"ContainerDied","Data":"21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926"} Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.550007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fb7a871c-04b4-491b-a767-b01a2b3b38cf","Type":"ContainerDied","Data":"15e204034583164a598297706a7c88e6d819d089fbf7393124fcbc4530c8e349"} Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.550023 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15e204034583164a598297706a7c88e6d819d089fbf7393124fcbc4530c8e349" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.583887 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.583857652 podStartE2EDuration="2.583857652s" podCreationTimestamp="2025-10-10 07:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:46.572997856 +0000 UTC m=+1293.642589107" watchObservedRunningTime="2025-10-10 07:12:46.583857652 +0000 UTC m=+1293.653448893" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.618434 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.794401 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-combined-ca-bundle\") pod \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.795402 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dgks\" (UniqueName: \"kubernetes.io/projected/fb7a871c-04b4-491b-a767-b01a2b3b38cf-kube-api-access-9dgks\") pod \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.795431 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-config-data\") pod \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\" (UID: \"fb7a871c-04b4-491b-a767-b01a2b3b38cf\") " Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.801406 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7a871c-04b4-491b-a767-b01a2b3b38cf-kube-api-access-9dgks" (OuterVolumeSpecName: "kube-api-access-9dgks") pod "fb7a871c-04b4-491b-a767-b01a2b3b38cf" (UID: 
"fb7a871c-04b4-491b-a767-b01a2b3b38cf"). InnerVolumeSpecName "kube-api-access-9dgks". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.837317 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb7a871c-04b4-491b-a767-b01a2b3b38cf" (UID: "fb7a871c-04b4-491b-a767-b01a2b3b38cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.838095 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-config-data" (OuterVolumeSpecName: "config-data") pod "fb7a871c-04b4-491b-a767-b01a2b3b38cf" (UID: "fb7a871c-04b4-491b-a767-b01a2b3b38cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.897635 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.897675 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dgks\" (UniqueName: \"kubernetes.io/projected/fb7a871c-04b4-491b-a767-b01a2b3b38cf-kube-api-access-9dgks\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:46 crc kubenswrapper[4732]: I1010 07:12:46.897717 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb7a871c-04b4-491b-a767-b01a2b3b38cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.557424 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.600556 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.640882 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.657706 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:47 crc kubenswrapper[4732]: E1010 07:12:47.658148 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7a871c-04b4-491b-a767-b01a2b3b38cf" containerName="nova-scheduler-scheduler" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.658169 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7a871c-04b4-491b-a767-b01a2b3b38cf" containerName="nova-scheduler-scheduler" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.658357 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7a871c-04b4-491b-a767-b01a2b3b38cf" containerName="nova-scheduler-scheduler" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.659062 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.663264 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.673332 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb7a871c-04b4-491b-a767-b01a2b3b38cf" path="/var/lib/kubelet/pods/fb7a871c-04b4-491b-a767-b01a2b3b38cf/volumes" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.674142 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.814271 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-config-data\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.814528 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.814759 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmvd8\" (UniqueName: \"kubernetes.io/projected/53c7a322-6bdd-4613-9a25-39391becbb81-kube-api-access-qmvd8\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.917290 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmvd8\" (UniqueName: 
\"kubernetes.io/projected/53c7a322-6bdd-4613-9a25-39391becbb81-kube-api-access-qmvd8\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.917533 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-config-data\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.917604 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.923619 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-config-data\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.923804 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.933253 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmvd8\" (UniqueName: \"kubernetes.io/projected/53c7a322-6bdd-4613-9a25-39391becbb81-kube-api-access-qmvd8\") pod \"nova-scheduler-0\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " 
pod="openstack/nova-scheduler-0" Oct 10 07:12:47 crc kubenswrapper[4732]: I1010 07:12:47.993523 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:12:48 crc kubenswrapper[4732]: I1010 07:12:48.487189 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:12:48 crc kubenswrapper[4732]: I1010 07:12:48.568513 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53c7a322-6bdd-4613-9a25-39391becbb81","Type":"ContainerStarted","Data":"3fa6fe5f18d94b502e14fd85cd60727f160f0c09b94c48b0078b09ca87abf84e"} Oct 10 07:12:49 crc kubenswrapper[4732]: I1010 07:12:49.579166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53c7a322-6bdd-4613-9a25-39391becbb81","Type":"ContainerStarted","Data":"33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711"} Oct 10 07:12:49 crc kubenswrapper[4732]: I1010 07:12:49.597300 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5972758479999998 podStartE2EDuration="2.597275848s" podCreationTimestamp="2025-10-10 07:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:12:49.595725196 +0000 UTC m=+1296.665316447" watchObservedRunningTime="2025-10-10 07:12:49.597275848 +0000 UTC m=+1296.666867099" Oct 10 07:12:50 crc kubenswrapper[4732]: I1010 07:12:50.214367 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:12:50 crc kubenswrapper[4732]: I1010 07:12:50.214815 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 07:12:51 crc kubenswrapper[4732]: I1010 07:12:51.855245 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Oct 10 07:12:51 crc kubenswrapper[4732]: I1010 07:12:51.855766 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 07:12:52 crc kubenswrapper[4732]: I1010 07:12:52.892071 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:52 crc kubenswrapper[4732]: I1010 07:12:52.904104 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:52 crc kubenswrapper[4732]: I1010 07:12:52.993637 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.208078 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.208456 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.356345 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.356442 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.356513 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.357626 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58650635f1fbb6c2c9e22b572de1a4b4db4e63148a8b451b4d739ea558750b87"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.357788 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://58650635f1fbb6c2c9e22b572de1a4b4db4e63148a8b451b4d739ea558750b87" gracePeriod=600 Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.638970 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="58650635f1fbb6c2c9e22b572de1a4b4db4e63148a8b451b4d739ea558750b87" exitCode=0 Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.639036 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"58650635f1fbb6c2c9e22b572de1a4b4db4e63148a8b451b4d739ea558750b87"} Oct 10 07:12:55 crc kubenswrapper[4732]: I1010 07:12:55.639262 4732 scope.go:117] "RemoveContainer" 
containerID="239755fce5a5e3e2f7099222145032c91288a72b835ff1a7f25dd0d9f8c8d6b0" Oct 10 07:12:56 crc kubenswrapper[4732]: I1010 07:12:56.217892 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:56 crc kubenswrapper[4732]: I1010 07:12:56.218562 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 07:12:56 crc kubenswrapper[4732]: I1010 07:12:56.654902 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"} Oct 10 07:12:57 crc kubenswrapper[4732]: I1010 07:12:57.994272 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 07:12:58 crc kubenswrapper[4732]: I1010 07:12:58.043746 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 07:12:58 crc kubenswrapper[4732]: I1010 07:12:58.727162 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 07:13:00 crc kubenswrapper[4732]: I1010 07:13:00.008240 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 07:13:01 crc kubenswrapper[4732]: I1010 07:13:01.865364 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Oct 10 07:13:01 crc kubenswrapper[4732]: I1010 07:13:01.866551 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 07:13:01 crc kubenswrapper[4732]: I1010 07:13:01.872148 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 07:13:01 crc kubenswrapper[4732]: I1010 07:13:01.877237 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 07:13:02 crc kubenswrapper[4732]: I1010 07:13:02.731107 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 07:13:02 crc kubenswrapper[4732]: I1010 07:13:02.739850 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 07:13:05 crc kubenswrapper[4732]: I1010 07:13:05.218525 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 07:13:05 crc kubenswrapper[4732]: I1010 07:13:05.219549 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 07:13:05 crc kubenswrapper[4732]: I1010 07:13:05.229512 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 07:13:05 crc kubenswrapper[4732]: I1010 07:13:05.231269 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.369795 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.370669 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c58c2bae-9347-4644-ae19-ff3781571610" containerName="openstackclient" containerID="cri-o://1df0bfc677e23cf3848c6b955400cf7ad114080934e2ff619ab58f33fe07c595" 
gracePeriod=2 Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.393898 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.496704 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:13:23 crc kubenswrapper[4732]: E1010 07:13:23.647569 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:23 crc kubenswrapper[4732]: E1010 07:13:23.647652 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data podName:88a11668-5ab6-4b77-8bb7-ac60140f4bd4 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:24.147631946 +0000 UTC m=+1331.217223187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data") pod "rabbitmq-cell1-server-0" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4") : configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.794300 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.794614 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="openstack-network-exporter" containerID="cri-o://695c8e21da8fa77374077cd2cc05c6d275fab9cd31581217ca2977b370adcc19" gracePeriod=300 Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.834455 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder62ae-account-delete-jd7lb"] Oct 10 07:13:23 crc kubenswrapper[4732]: E1010 07:13:23.834811 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c58c2bae-9347-4644-ae19-ff3781571610" containerName="openstackclient" Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.834828 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58c2bae-9347-4644-ae19-ff3781571610" containerName="openstackclient" Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.834990 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58c2bae-9347-4644-ae19-ff3781571610" containerName="openstackclient" Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.840713 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder62ae-account-delete-jd7lb" Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.888178 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.915397 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder62ae-account-delete-jd7lb"] Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.985670 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement0e64-account-delete-6ncrf"] Oct 10 07:13:23 crc kubenswrapper[4732]: I1010 07:13:23.987147 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement0e64-account-delete-6ncrf" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.017706 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement0e64-account-delete-6ncrf"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.025617 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-q6pdc"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.038307 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-q6pdc"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.049282 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.049514 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="ovn-northd" containerID="cri-o://47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.049927 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="openstack-network-exporter" containerID="cri-o://1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.057655 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzq9\" (UniqueName: \"kubernetes.io/projected/80309f7c-d137-4116-a447-c9749c27c669-kube-api-access-4wzq9\") pod \"cinder62ae-account-delete-jd7lb\" (UID: \"80309f7c-d137-4116-a447-c9749c27c669\") " pod="openstack/cinder62ae-account-delete-jd7lb" Oct 10 07:13:24 crc kubenswrapper[4732]: E1010 07:13:24.058595 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap 
"rabbitmq-config-data" not found Oct 10 07:13:24 crc kubenswrapper[4732]: E1010 07:13:24.058642 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data podName:565f831c-0da8-4481-8461-8522e0cfa801 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:24.558628907 +0000 UTC m=+1331.628220148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data") pod "rabbitmq-server-0" (UID: "565f831c-0da8-4481-8461-8522e0cfa801") : configmap "rabbitmq-config-data" not found Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.077854 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="ovsdbserver-sb" containerID="cri-o://9babba80be3c1a6fd055e84387fb3c74f74af8c92bd4f83ebb53cf7d2b84b84d" gracePeriod=300 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.096770 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicand0a1-account-delete-5xtdn"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.098084 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicand0a1-account-delete-5xtdn" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.120791 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicand0a1-account-delete-5xtdn"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.137816 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cx6h2"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.153978 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cx6h2"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.159103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph99j\" (UniqueName: \"kubernetes.io/projected/cb766f51-b132-4979-b32e-a2cfcb3edb50-kube-api-access-ph99j\") pod \"barbicand0a1-account-delete-5xtdn\" (UID: \"cb766f51-b132-4979-b32e-a2cfcb3edb50\") " pod="openstack/barbicand0a1-account-delete-5xtdn" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.159141 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx95j\" (UniqueName: \"kubernetes.io/projected/4ea62a47-1d15-41a2-a0d0-a0456a46183a-kube-api-access-kx95j\") pod \"placement0e64-account-delete-6ncrf\" (UID: \"4ea62a47-1d15-41a2-a0d0-a0456a46183a\") " pod="openstack/placement0e64-account-delete-6ncrf" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.159291 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzq9\" (UniqueName: \"kubernetes.io/projected/80309f7c-d137-4116-a447-c9749c27c669-kube-api-access-4wzq9\") pod \"cinder62ae-account-delete-jd7lb\" (UID: \"80309f7c-d137-4116-a447-c9749c27c669\") " pod="openstack/cinder62ae-account-delete-jd7lb" Oct 10 07:13:24 crc kubenswrapper[4732]: E1010 07:13:24.159643 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap 
"rabbitmq-cell1-config-data" not found Oct 10 07:13:24 crc kubenswrapper[4732]: E1010 07:13:24.159681 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data podName:88a11668-5ab6-4b77-8bb7-ac60140f4bd4 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:25.15966795 +0000 UTC m=+1332.229259181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data") pod "rabbitmq-cell1-server-0" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4") : configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.240699 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-n9v88"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.262116 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph99j\" (UniqueName: \"kubernetes.io/projected/cb766f51-b132-4979-b32e-a2cfcb3edb50-kube-api-access-ph99j\") pod \"barbicand0a1-account-delete-5xtdn\" (UID: \"cb766f51-b132-4979-b32e-a2cfcb3edb50\") " pod="openstack/barbicand0a1-account-delete-5xtdn" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.262318 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx95j\" (UniqueName: \"kubernetes.io/projected/4ea62a47-1d15-41a2-a0d0-a0456a46183a-kube-api-access-kx95j\") pod \"placement0e64-account-delete-6ncrf\" (UID: \"4ea62a47-1d15-41a2-a0d0-a0456a46183a\") " pod="openstack/placement0e64-account-delete-6ncrf" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.322885 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzq9\" (UniqueName: \"kubernetes.io/projected/80309f7c-d137-4116-a447-c9749c27c669-kube-api-access-4wzq9\") pod \"cinder62ae-account-delete-jd7lb\" (UID: 
\"80309f7c-d137-4116-a447-c9749c27c669\") " pod="openstack/cinder62ae-account-delete-jd7lb" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.340572 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glancedb70-account-delete-6srhp"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.349063 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx95j\" (UniqueName: \"kubernetes.io/projected/4ea62a47-1d15-41a2-a0d0-a0456a46183a-kube-api-access-kx95j\") pod \"placement0e64-account-delete-6ncrf\" (UID: \"4ea62a47-1d15-41a2-a0d0-a0456a46183a\") " pod="openstack/placement0e64-account-delete-6ncrf" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.351669 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancedb70-account-delete-6srhp" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.368264 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutrone3b5-account-delete-dtqm9"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.378548 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph99j\" (UniqueName: \"kubernetes.io/projected/cb766f51-b132-4979-b32e-a2cfcb3edb50-kube-api-access-ph99j\") pod \"barbicand0a1-account-delete-5xtdn\" (UID: \"cb766f51-b132-4979-b32e-a2cfcb3edb50\") " pod="openstack/barbicand0a1-account-delete-5xtdn" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.396188 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutrone3b5-account-delete-dtqm9" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.422604 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glancedb70-account-delete-6srhp"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.438595 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutrone3b5-account-delete-dtqm9"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.455339 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-qnzwq"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.455583 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-qnzwq" podUID="0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" containerName="openstack-network-exporter" containerID="cri-o://6a693cf87726dd07aef0243f81c2ca77c5d4545a90e8f3f043f685eaf87b6af5" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.464235 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicand0a1-account-delete-5xtdn" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.478386 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lzkzk"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.485758 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder62ae-account-delete-jd7lb" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.497921 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-x6n9t"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.505835 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvrl\" (UniqueName: \"kubernetes.io/projected/03efe727-1f84-49e0-b6cb-a7189a02ba76-kube-api-access-7tvrl\") pod \"glancedb70-account-delete-6srhp\" (UID: \"03efe727-1f84-49e0-b6cb-a7189a02ba76\") " pod="openstack/glancedb70-account-delete-6srhp" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.506172 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxpgm\" (UniqueName: \"kubernetes.io/projected/5c751e0c-75c7-4aaf-bf32-55e6d022d802-kube-api-access-gxpgm\") pod \"neutrone3b5-account-delete-dtqm9\" (UID: \"5c751e0c-75c7-4aaf-bf32-55e6d022d802\") " pod="openstack/neutrone3b5-account-delete-dtqm9" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.517639 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-x6n9t"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.561204 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-nlxqk"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.587960 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-nlxqk"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.600444 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-9sdb9"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.619920 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvrl\" (UniqueName: \"kubernetes.io/projected/03efe727-1f84-49e0-b6cb-a7189a02ba76-kube-api-access-7tvrl\") pod 
\"glancedb70-account-delete-6srhp\" (UID: \"03efe727-1f84-49e0-b6cb-a7189a02ba76\") " pod="openstack/glancedb70-account-delete-6srhp" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.620982 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxpgm\" (UniqueName: \"kubernetes.io/projected/5c751e0c-75c7-4aaf-bf32-55e6d022d802-kube-api-access-gxpgm\") pod \"neutrone3b5-account-delete-dtqm9\" (UID: \"5c751e0c-75c7-4aaf-bf32-55e6d022d802\") " pod="openstack/neutrone3b5-account-delete-dtqm9" Oct 10 07:13:24 crc kubenswrapper[4732]: E1010 07:13:24.621445 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 10 07:13:24 crc kubenswrapper[4732]: E1010 07:13:24.621495 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data podName:565f831c-0da8-4481-8461-8522e0cfa801 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:25.621479586 +0000 UTC m=+1332.691070827 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data") pod "rabbitmq-server-0" (UID: "565f831c-0da8-4481-8461-8522e0cfa801") : configmap "rabbitmq-config-data" not found Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.625793 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-9sdb9"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.626894 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement0e64-account-delete-6ncrf" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.650313 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxpgm\" (UniqueName: \"kubernetes.io/projected/5c751e0c-75c7-4aaf-bf32-55e6d022d802-kube-api-access-gxpgm\") pod \"neutrone3b5-account-delete-dtqm9\" (UID: \"5c751e0c-75c7-4aaf-bf32-55e6d022d802\") " pod="openstack/neutrone3b5-account-delete-dtqm9" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.656282 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-p5t8l"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.671683 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvrl\" (UniqueName: \"kubernetes.io/projected/03efe727-1f84-49e0-b6cb-a7189a02ba76-kube-api-access-7tvrl\") pod \"glancedb70-account-delete-6srhp\" (UID: \"03efe727-1f84-49e0-b6cb-a7189a02ba76\") " pod="openstack/glancedb70-account-delete-6srhp" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.676479 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-p5t8l"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.703881 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.704209 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="cinder-scheduler" containerID="cri-o://698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.704788 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="probe" 
containerID="cri-o://335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.739820 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.740048 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api-log" containerID="cri-o://10d139c980b60b956965d5489b41546b308f38d9be28a63508a7305b312c69c4" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.740171 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api" containerID="cri-o://f4d100a6f0ecddf29e4d78c67e5019b69856d21ffbb0b74a4b0262657c6c0304" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.779108 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glancedb70-account-delete-6srhp" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.793167 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi0b57-account-delete-95cjw"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.800178 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutrone3b5-account-delete-dtqm9" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.801557 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi0b57-account-delete-95cjw" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.853128 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi0b57-account-delete-95cjw"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.933446 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkrt8\" (UniqueName: \"kubernetes.io/projected/c76de706-34bc-4b37-8492-3573c19e91c2-kube-api-access-xkrt8\") pod \"novaapi0b57-account-delete-95cjw\" (UID: \"c76de706-34bc-4b37-8492-3573c19e91c2\") " pod="openstack/novaapi0b57-account-delete-95cjw" Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.975761 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d445dfc98-wk5w4"] Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.976019 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d445dfc98-wk5w4" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-log" containerID="cri-o://9e5587596a7f6545f5ee41c7fe004abf66a409e2bfb223d64c2066b916dae202" gracePeriod=30 Oct 10 07:13:24 crc kubenswrapper[4732]: I1010 07:13:24.978593 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d445dfc98-wk5w4" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-api" containerID="cri-o://ba7a1f03f18ae86234997ab8ec3532045109f6ac2550d2fdf25633eb2d62be0a" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.035220 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkrt8\" (UniqueName: \"kubernetes.io/projected/c76de706-34bc-4b37-8492-3573c19e91c2-kube-api-access-xkrt8\") pod \"novaapi0b57-account-delete-95cjw\" (UID: \"c76de706-34bc-4b37-8492-3573c19e91c2\") " pod="openstack/novaapi0b57-account-delete-95cjw" Oct 10 07:13:25 crc 
kubenswrapper[4732]: I1010 07:13:25.038830 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell00878-account-delete-pjh75"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.046562 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell00878-account-delete-pjh75" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.055957 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell00878-account-delete-pjh75"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.065208 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1a8d0-account-delete-tcwnj"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.066318 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1a8d0-account-delete-tcwnj" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.083811 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkrt8\" (UniqueName: \"kubernetes.io/projected/c76de706-34bc-4b37-8492-3573c19e91c2-kube-api-access-xkrt8\") pod \"novaapi0b57-account-delete-95cjw\" (UID: \"c76de706-34bc-4b37-8492-3573c19e91c2\") " pod="openstack/novaapi0b57-account-delete-95cjw" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.098969 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1a8d0-account-delete-tcwnj"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.125998 4732 generic.go:334] "Generic (PLEG): container finished" podID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerID="10d139c980b60b956965d5489b41546b308f38d9be28a63508a7305b312c69c4" exitCode=143 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.126059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb179b69-8c25-49b1-88b5-6c17953ffbcd","Type":"ContainerDied","Data":"10d139c980b60b956965d5489b41546b308f38d9be28a63508a7305b312c69c4"} Oct 10 
07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.187855 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi0b57-account-delete-95cjw" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.200874 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b59764b5c-95h4h"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.201143 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" podUID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" containerName="dnsmasq-dns" containerID="cri-o://6a5d33aa67dcba33d9cca4c7fdc0763242994866d812f7cf3aeebaaa727f6f35" gracePeriod=10 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.213657 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.213992 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerName="openstack-network-exporter" containerID="cri-o://3368687e64b8c0c613b949dc899a5ce3a9150e48d0028af9fccd8eb195d75f5b" gracePeriod=300 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.220520 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.220731 4732 generic.go:334] "Generic (PLEG): container finished" podID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerID="1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a" exitCode=2 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.220793 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1b967-cc4d-4092-87e9-64cbbc84be27","Type":"ContainerDied","Data":"1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a"} Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.220956 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-server" containerID="cri-o://aea23c87d7a0e1648589bbcd40543c7d5e8ccf5a80b3a896677fc3b317ec2dda" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221056 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="swift-recon-cron" containerID="cri-o://6aa268d1067b6515b564fbd351c694b7f8bd27f2ca765a2e848302e1ec2da0ec" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221108 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="rsync" containerID="cri-o://7c44462876be789a8e5caeabb0625c49ae5413ec6663dae73e6b157a5e977d76" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221525 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-updater" containerID="cri-o://43a213f53856bce5c190f44e7458e042262da79e1784f2045dda4e75dc3471b6" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221630 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-expirer" containerID="cri-o://c875be356f22174bd7fe912809d07ce631dcb17edd6d1d6aabc340d517cc6551" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221672 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-updater" 
containerID="cri-o://7740e27fefba27d5e80df5ff662cfd5fc4b86c96b608fa32c24f8d2b25cee4a2" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221730 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-auditor" containerID="cri-o://e2917273d26b808e5a8fc08c8152f588e5014472d4e7a647ebdbecedcba84fda" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221755 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-server" containerID="cri-o://4577435a75c7166b15559759271a9948adb5a88482a2db26d6c48d48b9208d39" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221784 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-replicator" containerID="cri-o://84783f363dda1053c7f032969b8a9b632ff711d6f0764371f0d881ee3ad20516" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221804 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-auditor" containerID="cri-o://bcadb525584dba5a9a1af302bfd2be19ff703a8c744b55dbf166f43746dfd5fa" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221822 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-server" containerID="cri-o://d4e49bf9ad485c0fe0bbb4a2dbc2f08f31e1f3158c54e7e7a0fa81f3f0046870" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221840 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-replicator" containerID="cri-o://ca80afd8ea95c25e8f07db4e28d154c1d53a72ce3f36789ae3ff9af29cf3a561" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221875 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-auditor" containerID="cri-o://04a8993decfa9c602d76b910a4eb75f9a4b7db875ca6bfa209f16244327dbd1a" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221884 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-replicator" containerID="cri-o://440271d54f3b94f368b668f0086f762ecb8f963317d10585e119ad50bf50d796" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.221910 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-reaper" containerID="cri-o://306e993d7c42ab68be7b6186fb51f97b059b5a7bcc1a130f1b6cecbe5bae570f" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.249045 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxkfm\" (UniqueName: \"kubernetes.io/projected/1f7ba305-07fd-408c-865f-463e3738e6cb-kube-api-access-pxkfm\") pod \"novacell00878-account-delete-pjh75\" (UID: \"1f7ba305-07fd-408c-865f-463e3738e6cb\") " pod="openstack/novacell00878-account-delete-pjh75" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.249340 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxxx\" (UniqueName: \"kubernetes.io/projected/56a05e0a-f30f-4b7c-b939-eba8d0094d48-kube-api-access-fhxxx\") pod 
\"novacell1a8d0-account-delete-tcwnj\" (UID: \"56a05e0a-f30f-4b7c-b939-eba8d0094d48\") " pod="openstack/novacell1a8d0-account-delete-tcwnj" Oct 10 07:13:25 crc kubenswrapper[4732]: E1010 07:13:25.273335 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:25 crc kubenswrapper[4732]: E1010 07:13:25.273421 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data podName:88a11668-5ab6-4b77-8bb7-ac60140f4bd4 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:27.273399072 +0000 UTC m=+1334.342990313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data") pod "rabbitmq-cell1-server-0" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4") : configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.303069 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_64dcf265-8f29-46bc-9b03-40dda51f606b/ovsdbserver-sb/0.log" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.303129 4732 generic.go:334] "Generic (PLEG): container finished" podID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerID="695c8e21da8fa77374077cd2cc05c6d275fab9cd31581217ca2977b370adcc19" exitCode=2 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.303163 4732 generic.go:334] "Generic (PLEG): container finished" podID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerID="9babba80be3c1a6fd055e84387fb3c74f74af8c92bd4f83ebb53cf7d2b84b84d" exitCode=143 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.303270 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"64dcf265-8f29-46bc-9b03-40dda51f606b","Type":"ContainerDied","Data":"695c8e21da8fa77374077cd2cc05c6d275fab9cd31581217ca2977b370adcc19"} Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.303298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"64dcf265-8f29-46bc-9b03-40dda51f606b","Type":"ContainerDied","Data":"9babba80be3c1a6fd055e84387fb3c74f74af8c92bd4f83ebb53cf7d2b84b84d"} Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.326257 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qnzwq_0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8/openstack-network-exporter/0.log" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.327340 4732 generic.go:334] "Generic (PLEG): container finished" podID="0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" containerID="6a693cf87726dd07aef0243f81c2ca77c5d4545a90e8f3f043f685eaf87b6af5" exitCode=2 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.327389 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qnzwq" event={"ID":"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8","Type":"ContainerDied","Data":"6a693cf87726dd07aef0243f81c2ca77c5d4545a90e8f3f043f685eaf87b6af5"} Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.364455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxxx\" (UniqueName: \"kubernetes.io/projected/56a05e0a-f30f-4b7c-b939-eba8d0094d48-kube-api-access-fhxxx\") pod \"novacell1a8d0-account-delete-tcwnj\" (UID: \"56a05e0a-f30f-4b7c-b939-eba8d0094d48\") " pod="openstack/novacell1a8d0-account-delete-tcwnj" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.364556 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxkfm\" (UniqueName: \"kubernetes.io/projected/1f7ba305-07fd-408c-865f-463e3738e6cb-kube-api-access-pxkfm\") pod \"novacell00878-account-delete-pjh75\" (UID: 
\"1f7ba305-07fd-408c-865f-463e3738e6cb\") " pod="openstack/novacell00878-account-delete-pjh75" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.460334 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxxx\" (UniqueName: \"kubernetes.io/projected/56a05e0a-f30f-4b7c-b939-eba8d0094d48-kube-api-access-fhxxx\") pod \"novacell1a8d0-account-delete-tcwnj\" (UID: \"56a05e0a-f30f-4b7c-b939-eba8d0094d48\") " pod="openstack/novacell1a8d0-account-delete-tcwnj" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.480802 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.498421 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxkfm\" (UniqueName: \"kubernetes.io/projected/1f7ba305-07fd-408c-865f-463e3738e6cb-kube-api-access-pxkfm\") pod \"novacell00878-account-delete-pjh75\" (UID: \"1f7ba305-07fd-408c-865f-463e3738e6cb\") " pod="openstack/novacell00878-account-delete-pjh75" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.541610 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" containerID="cri-o://004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" gracePeriod=29 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.569898 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.570102 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-log" containerID="cri-o://58d341eb205d877224a5cb6a46597c2de42b015fa39f99b361d6a4d67ded1cc4" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 
07:13:25.570540 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-httpd" containerID="cri-o://f366ccf0fd7eff9163283eb01f40b778944fbee5750e2fdcbc35a6bd70d5f9a8" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.600879 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-785547cb47-x77nc"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.601314 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-785547cb47-x77nc" podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-api" containerID="cri-o://2d672d5afe033cb8d13e4990cc214b86467df1b5080405aeaab34bdda430f497" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.601717 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-785547cb47-x77nc" podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-httpd" containerID="cri-o://7c102b472c047348ed7dff4aff9894c0cde366c8c678aa7233770836081af19e" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.615203 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ng442"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.636460 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ng442"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.645733 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerName="ovsdbserver-nb" containerID="cri-o://ed3b3ed20134b6ded19f5781525993fc3a4c2c1ec91fdf5fb5344ce971381d40" gracePeriod=300 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.656290 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-62ae-account-create-tps8w"] Oct 10 07:13:25 crc kubenswrapper[4732]: E1010 07:13:25.681173 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 10 07:13:25 crc kubenswrapper[4732]: E1010 07:13:25.681224 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data podName:565f831c-0da8-4481-8461-8522e0cfa801 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:27.681209975 +0000 UTC m=+1334.750801216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data") pod "rabbitmq-server-0" (UID: "565f831c-0da8-4481-8461-8522e0cfa801") : configmap "rabbitmq-config-data" not found Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.684909 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerName="rabbitmq" containerID="cri-o://84a5b3ebb026e19550cf8d398201da96d41bac755d18b33cc928544d7a6cf2c5" gracePeriod=604800 Oct 10 07:13:25 crc kubenswrapper[4732]: E1010 07:13:25.750065 4732 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 10 07:13:25 crc kubenswrapper[4732]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 10 07:13:25 crc kubenswrapper[4732]: + source /usr/local/bin/container-scripts/functions Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNBridge=br-int Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNRemote=tcp:localhost:6642 Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNEncapType=geneve Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNAvailabilityZones= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ EnableChassisAsGateway=true Oct 10 07:13:25 
crc kubenswrapper[4732]: ++ PhysicalNetworks= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNHostName= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 10 07:13:25 crc kubenswrapper[4732]: ++ ovs_dir=/var/lib/openvswitch Oct 10 07:13:25 crc kubenswrapper[4732]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 10 07:13:25 crc kubenswrapper[4732]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 10 07:13:25 crc kubenswrapper[4732]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + sleep 0.5 Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + sleep 0.5 Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + cleanup_ovsdb_server_semaphore Oct 10 07:13:25 crc kubenswrapper[4732]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 07:13:25 crc kubenswrapper[4732]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 10 07:13:25 crc kubenswrapper[4732]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-n9v88" message=< Oct 10 07:13:25 crc kubenswrapper[4732]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 10 07:13:25 crc kubenswrapper[4732]: + source /usr/local/bin/container-scripts/functions Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNBridge=br-int Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNRemote=tcp:localhost:6642 Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNEncapType=geneve Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNAvailabilityZones= Oct 10 07:13:25 crc 
kubenswrapper[4732]: ++ EnableChassisAsGateway=true Oct 10 07:13:25 crc kubenswrapper[4732]: ++ PhysicalNetworks= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNHostName= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 10 07:13:25 crc kubenswrapper[4732]: ++ ovs_dir=/var/lib/openvswitch Oct 10 07:13:25 crc kubenswrapper[4732]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 10 07:13:25 crc kubenswrapper[4732]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 10 07:13:25 crc kubenswrapper[4732]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + sleep 0.5 Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + sleep 0.5 Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + cleanup_ovsdb_server_semaphore Oct 10 07:13:25 crc kubenswrapper[4732]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 07:13:25 crc kubenswrapper[4732]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 10 07:13:25 crc kubenswrapper[4732]: > Oct 10 07:13:25 crc kubenswrapper[4732]: E1010 07:13:25.750106 4732 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 10 07:13:25 crc kubenswrapper[4732]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 10 07:13:25 crc kubenswrapper[4732]: + source /usr/local/bin/container-scripts/functions Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNBridge=br-int Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNRemote=tcp:localhost:6642 Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNEncapType=geneve Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNAvailabilityZones= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ EnableChassisAsGateway=true Oct 10 07:13:25 crc kubenswrapper[4732]: ++ PhysicalNetworks= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ OVNHostName= Oct 10 07:13:25 crc kubenswrapper[4732]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 10 07:13:25 crc kubenswrapper[4732]: ++ ovs_dir=/var/lib/openvswitch Oct 10 07:13:25 crc kubenswrapper[4732]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 10 07:13:25 crc kubenswrapper[4732]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 10 07:13:25 crc kubenswrapper[4732]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + sleep 0.5 Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + sleep 0.5 Oct 10 07:13:25 crc kubenswrapper[4732]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 10 07:13:25 crc kubenswrapper[4732]: + cleanup_ovsdb_server_semaphore Oct 10 07:13:25 crc kubenswrapper[4732]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 10 07:13:25 crc kubenswrapper[4732]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 10 07:13:25 crc kubenswrapper[4732]: > pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" containerID="cri-o://0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.750137 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" containerID="cri-o://0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" gracePeriod=29 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.800054 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell00878-account-delete-pjh75" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.806926 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122832c9-8a6a-48f4-988c-7c4de7dd085a" path="/var/lib/kubelet/pods/122832c9-8a6a-48f4-988c-7c4de7dd085a/volumes" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.807512 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14fd1193-0d3e-4629-86df-7cb03f4b9b33" path="/var/lib/kubelet/pods/14fd1193-0d3e-4629-86df-7cb03f4b9b33/volumes" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.808119 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="898d7ec3-23c2-40a7-b224-cb69ac84e188" path="/var/lib/kubelet/pods/898d7ec3-23c2-40a7-b224-cb69ac84e188/volumes" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.809544 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89ce220-623c-443f-93f4-4a960ffe29eb" path="/var/lib/kubelet/pods/b89ce220-623c-443f-93f4-4a960ffe29eb/volumes" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.810180 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e63d44-6624-462c-9bbd-c6a160083bd0" path="/var/lib/kubelet/pods/d4e63d44-6624-462c-9bbd-c6a160083bd0/volumes" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.810754 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e60af77c-522d-441f-9174-a0242edc0361" path="/var/lib/kubelet/pods/e60af77c-522d-441f-9174-a0242edc0361/volumes" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.811243 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef029637-5b71-415f-8e57-ceec4c813be6" path="/var/lib/kubelet/pods/ef029637-5b71-415f-8e57-ceec4c813be6/volumes" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812757 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-62ae-account-create-tps8w"] Oct 
10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812789 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder62ae-account-delete-jd7lb"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812803 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812814 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vn6nz"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812824 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vn6nz"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812834 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xdr4l"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812844 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xdr4l"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812853 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-76d49dbb9c-8g2mb"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812862 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6tnxp"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.812872 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6tnxp"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.813059 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-httpd" containerID="cri-o://f109eab8f1fb8cc71f4902a857174c8040a04484109856debe238368907364e0" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.813240 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-log" containerID="cri-o://b888ac0b6d592ec667eb861df78abe425788617cf262c3af3800b0ec2cf59863" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.813477 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-server" containerID="cri-o://cc76fa90e7b162f0b66e824eee5ff268aceed1434ce758c6900d6e7104073f19" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.813662 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-httpd" containerID="cri-o://d76c61903df35f0f8176003951cca022e8081f2049617d8550fff70d06901f35" gracePeriod=30 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.856443 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1a8d0-account-delete-tcwnj" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.873955 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.884060 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0e64-account-create-tsbdp"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.888995 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_64dcf265-8f29-46bc-9b03-40dda51f606b/ovsdbserver-sb/0.log" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.889088 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.896565 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0e64-account-create-tsbdp"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.907063 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement0e64-account-delete-6ncrf"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.935373 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-v7wvk"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.948885 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="565f831c-0da8-4481-8461-8522e0cfa801" containerName="rabbitmq" containerID="cri-o://f7527cba13db589dd756b76500f8bbf94063e072e3428b0e50a4da46b0e63723" gracePeriod=604800 Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.954799 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-v7wvk"] Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.958471 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Oct 10 07:13:25 crc kubenswrapper[4732]: I1010 07:13:25.996819 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d0a1-account-create-4wdrt"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.007759 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d0a1-account-create-4wdrt"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.028793 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicand0a1-account-delete-5xtdn"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.031040 4732 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-db70-account-create-lwrb5"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.044655 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db70-account-create-lwrb5"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.060205 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ps9wp"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.073757 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qnzwq_0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8/openstack-network-exporter/0.log" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.073827 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-qnzwq" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.090638 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.090721 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-config\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.090743 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-scripts\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.090827 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-combined-ca-bundle\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.090856 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-metrics-certs-tls-certs\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.090893 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdbserver-sb-tls-certs\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.090989 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdb-rundir\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.091039 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rttrw\" (UniqueName: \"kubernetes.io/projected/64dcf265-8f29-46bc-9b03-40dda51f606b-kube-api-access-rttrw\") pod \"64dcf265-8f29-46bc-9b03-40dda51f606b\" (UID: \"64dcf265-8f29-46bc-9b03-40dda51f606b\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.095438 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ps9wp"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.098852 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" 
(OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.099385 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.101218 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-config" (OuterVolumeSpecName: "config") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.101808 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-scripts" (OuterVolumeSpecName: "scripts") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.105972 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64dcf265-8f29-46bc-9b03-40dda51f606b-kube-api-access-rttrw" (OuterVolumeSpecName: "kube-api-access-rttrw") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "kube-api-access-rttrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.106039 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancedb70-account-delete-6srhp"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.111806 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e3b5-account-create-fcdjv"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.119617 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-mvmsb"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.125424 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e3b5-account-create-fcdjv"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.131387 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-mvmsb"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.181604 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrone3b5-account-delete-dtqm9"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.205333 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovn-rundir\") pod \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.205398 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovs-rundir\") pod \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.205443 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-config\") pod 
\"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.205536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-metrics-certs-tls-certs\") pod \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.205615 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-combined-ca-bundle\") pod \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.205675 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5bbl\" (UniqueName: \"kubernetes.io/projected/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-kube-api-access-m5bbl\") pod \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\" (UID: \"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.206088 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.206107 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rttrw\" (UniqueName: \"kubernetes.io/projected/64dcf265-8f29-46bc-9b03-40dda51f606b-kube-api-access-rttrw\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.206128 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 10 
07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.206139 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.206149 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64dcf265-8f29-46bc-9b03-40dda51f606b-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.206109 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" (UID: "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.206454 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" (UID: "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.207121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-config" (OuterVolumeSpecName: "config") pod "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" (UID: "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.212133 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-59899c8879-prgpj"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.212353 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-59899c8879-prgpj" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker-log" containerID="cri-o://a99618b8dab7c28ba86268863ff2d9ff67fee28ff3451930b3093109a33ec4fa" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.212800 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-59899c8879-prgpj" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker" containerID="cri-o://ef9057491152a5767b996f9aa867ebd6dd43e2419bbb3e1338a179c119d1dd11" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.212827 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-kube-api-access-m5bbl" (OuterVolumeSpecName: "kube-api-access-m5bbl") pod "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" (UID: "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8"). InnerVolumeSpecName "kube-api-access-m5bbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.218903 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.244702 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-69886dc6f8-sfv6d"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.244975 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener-log" containerID="cri-o://0865053222fa0b5007b970a0a688d80dffb2106bd8ecb37be061b0d8aaf978cd" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.245523 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener" containerID="cri-o://3854a4792dd684bc5e205f322924b756bc947446fb50e6b4d2c49c8df807513b" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.254227 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7949456448-wncp2"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.254588 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7949456448-wncp2" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api-log" containerID="cri-o://37340908e0e97c3a729ef1965ebc6960980d4f8695b1143d55d26a38eea03ce7" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.255910 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7949456448-wncp2" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api" containerID="cri-o://d77b0880ffd05c296384bd3f19b5b8b3ab8e7f54824859cab96f74dddf1fd9e2" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.257118 4732 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.257339 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-log" containerID="cri-o://101a8d6951b18acd6a4c085a8be7084960161610e3440bd4359884be73a60369" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.257465 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-api" containerID="cri-o://521cf68574ce4f3c728de951f4d7a2e5c5c7da7a4c60aac8f61aad22cdb5008d" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.261873 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.272856 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.292983 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.293602 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-log" containerID="cri-o://aaf10e61456ff1882f86011615c89d1ec3d16648bdf71900a294e95ad885a047" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.294040 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-metadata" containerID="cri-o://28679c08d1706b7a047ce63e0dbc74864b4cc97372d8f26b15d33225fb8a8912" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.294919 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" (UID: "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308819 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308864 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308874 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308882 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5bbl\" (UniqueName: \"kubernetes.io/projected/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-kube-api-access-m5bbl\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308892 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308902 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308928 4732 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.308936 4732 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.311359 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.320989 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-94k8g"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.334015 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a8d0-account-create-fx6nt"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.344626 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-94k8g"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.347541 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1a8d0-account-delete-tcwnj"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.367166 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a8d0-account-create-fx6nt"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.376753 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c29q6"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.378187 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerID="7c102b472c047348ed7dff4aff9894c0cde366c8c678aa7233770836081af19e" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.378211 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785547cb47-x77nc" event={"ID":"eb94a64c-1a0c-4a61-bb69-e843b627cf35","Type":"ContainerDied","Data":"7c102b472c047348ed7dff4aff9894c0cde366c8c678aa7233770836081af19e"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.387441 4732 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-c29q6"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.399479 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.400037 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="8d80b654-a26e-46ea-84f4-264c3c883250" containerName="nova-cell1-conductor-conductor" containerID="cri-o://da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.429595 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vf62z"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.437070 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439461 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="7c44462876be789a8e5caeabb0625c49ae5413ec6663dae73e6b157a5e977d76" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439491 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="c875be356f22174bd7fe912809d07ce631dcb17edd6d1d6aabc340d517cc6551" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439498 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="7740e27fefba27d5e80df5ff662cfd5fc4b86c96b608fa32c24f8d2b25cee4a2" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439505 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="e2917273d26b808e5a8fc08c8152f588e5014472d4e7a647ebdbecedcba84fda" 
exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439512 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="84783f363dda1053c7f032969b8a9b632ff711d6f0764371f0d881ee3ad20516" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439518 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="d4e49bf9ad485c0fe0bbb4a2dbc2f08f31e1f3158c54e7e7a0fa81f3f0046870" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439524 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="43a213f53856bce5c190f44e7458e042262da79e1784f2045dda4e75dc3471b6" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439531 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="bcadb525584dba5a9a1af302bfd2be19ff703a8c744b55dbf166f43746dfd5fa" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439538 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="ca80afd8ea95c25e8f07db4e28d154c1d53a72ce3f36789ae3ff9af29cf3a561" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439544 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="4577435a75c7166b15559759271a9948adb5a88482a2db26d6c48d48b9208d39" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439551 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="306e993d7c42ab68be7b6186fb51f97b059b5a7bcc1a130f1b6cecbe5bae570f" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439557 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" 
containerID="04a8993decfa9c602d76b910a4eb75f9a4b7db875ca6bfa209f16244327dbd1a" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439563 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="440271d54f3b94f368b668f0086f762ecb8f963317d10585e119ad50bf50d796" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439569 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="aea23c87d7a0e1648589bbcd40543c7d5e8ccf5a80b3a896677fc3b317ec2dda" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439595 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"7c44462876be789a8e5caeabb0625c49ae5413ec6663dae73e6b157a5e977d76"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439665 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"c875be356f22174bd7fe912809d07ce631dcb17edd6d1d6aabc340d517cc6551"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439682 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"7740e27fefba27d5e80df5ff662cfd5fc4b86c96b608fa32c24f8d2b25cee4a2"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439704 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"e2917273d26b808e5a8fc08c8152f588e5014472d4e7a647ebdbecedcba84fda"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439717 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"84783f363dda1053c7f032969b8a9b632ff711d6f0764371f0d881ee3ad20516"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439725 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"d4e49bf9ad485c0fe0bbb4a2dbc2f08f31e1f3158c54e7e7a0fa81f3f0046870"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439734 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"43a213f53856bce5c190f44e7458e042262da79e1784f2045dda4e75dc3471b6"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439742 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"bcadb525584dba5a9a1af302bfd2be19ff703a8c744b55dbf166f43746dfd5fa"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439751 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"ca80afd8ea95c25e8f07db4e28d154c1d53a72ce3f36789ae3ff9af29cf3a561"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439760 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"4577435a75c7166b15559759271a9948adb5a88482a2db26d6c48d48b9208d39"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439769 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"306e993d7c42ab68be7b6186fb51f97b059b5a7bcc1a130f1b6cecbe5bae570f"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 
07:13:26.439777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"04a8993decfa9c602d76b910a4eb75f9a4b7db875ca6bfa209f16244327dbd1a"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439786 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"440271d54f3b94f368b668f0086f762ecb8f963317d10585e119ad50bf50d796"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.439795 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"aea23c87d7a0e1648589bbcd40543c7d5e8ccf5a80b3a896677fc3b317ec2dda"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.449415 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "64dcf265-8f29-46bc-9b03-40dda51f606b" (UID: "64dcf265-8f29-46bc-9b03-40dda51f606b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.454712 4732 generic.go:334] "Generic (PLEG): container finished" podID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerID="f109eab8f1fb8cc71f4902a857174c8040a04484109856debe238368907364e0" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.454769 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" event={"ID":"7daaf3e5-82f0-45f7-aa22-40be65433320","Type":"ContainerDied","Data":"f109eab8f1fb8cc71f4902a857174c8040a04484109856debe238368907364e0"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.461779 4732 generic.go:334] "Generic (PLEG): container finished" podID="3e37998e-491a-43b8-abda-4bdfea233217" containerID="9e5587596a7f6545f5ee41c7fe004abf66a409e2bfb223d64c2066b916dae202" exitCode=143 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.461870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d445dfc98-wk5w4" event={"ID":"3e37998e-491a-43b8-abda-4bdfea233217","Type":"ContainerDied","Data":"9e5587596a7f6545f5ee41c7fe004abf66a409e2bfb223d64c2066b916dae202"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.476803 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vf62z"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.480057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" (UID: "0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.481030 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.483267 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a570b39e-7329-4bba-bfe0-cf5f7aa2269e","Type":"ContainerDied","Data":"58d341eb205d877224a5cb6a46597c2de42b015fa39f99b361d6a4d67ded1cc4"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.484350 4732 generic.go:334] "Generic (PLEG): container finished" podID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerID="58d341eb205d877224a5cb6a46597c2de42b015fa39f99b361d6a4d67ded1cc4" exitCode=143 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.485229 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.485436 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="13bb7b78-cc62-4d3b-a33a-9af77ee9e141" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.491153 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.491378 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="710f9fa6-588e-4226-a65d-5220d0a1f315" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f9506fceaa77699397e7b29b0e67d5a568de582fd92174a2332250afa9eed955" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.503747 4732 generic.go:334] "Generic (PLEG): container finished" podID="c58c2bae-9347-4644-ae19-ff3781571610" containerID="1df0bfc677e23cf3848c6b955400cf7ad114080934e2ff619ab58f33fe07c595" exitCode=137 Oct 10 07:13:26 crc 
kubenswrapper[4732]: I1010 07:13:26.503993 4732 scope.go:117] "RemoveContainer" containerID="1df0bfc677e23cf3848c6b955400cf7ad114080934e2ff619ab58f33fe07c595" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.504563 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.506127 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.506335 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="53c7a322-6bdd-4613-9a25-39391becbb81" containerName="nova-scheduler-scheduler" containerID="cri-o://33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.510921 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerName="galera" containerID="cri-o://61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298" gracePeriod=30 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.517060 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.517087 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/64dcf265-8f29-46bc-9b03-40dda51f606b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.524437 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_64dcf265-8f29-46bc-9b03-40dda51f606b/ovsdbserver-sb/0.log" Oct 10 07:13:26 crc 
kubenswrapper[4732]: I1010 07:13:26.524556 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.524561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"64dcf265-8f29-46bc-9b03-40dda51f606b","Type":"ContainerDied","Data":"921fc0c1c0b4fd25990384b47a4dc6619c07acc8751b61a5220b329348a3c52a"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.532937 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-qnzwq_0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8/openstack-network-exporter/0.log" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.533066 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-qnzwq" event={"ID":"0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8","Type":"ContainerDied","Data":"1b12750afa68cee82cea406fff26411748a1b05ec3a9a01d365a12c3b3a9f74e"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.533084 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-qnzwq" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.536131 4732 generic.go:334] "Generic (PLEG): container finished" podID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerID="b888ac0b6d592ec667eb861df78abe425788617cf262c3af3800b0ec2cf59863" exitCode=143 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.536189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2","Type":"ContainerDied","Data":"b888ac0b6d592ec667eb861df78abe425788617cf262c3af3800b0ec2cf59863"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.538655 4732 generic.go:334] "Generic (PLEG): container finished" podID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerID="335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.538727 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"27b69405-bc4b-4e39-be49-0a966bc649bb","Type":"ContainerDied","Data":"335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.552464 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0fa844d-f411-49a9-a52f-256760a71157" containerID="a99618b8dab7c28ba86268863ff2d9ff67fee28ff3451930b3093109a33ec4fa" exitCode=143 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.552530 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59899c8879-prgpj" event={"ID":"d0fa844d-f411-49a9-a52f-256760a71157","Type":"ContainerDied","Data":"a99618b8dab7c28ba86268863ff2d9ff67fee28ff3451930b3093109a33ec4fa"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.565274 4732 generic.go:334] "Generic (PLEG): container finished" podID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" 
containerID="6a5d33aa67dcba33d9cca4c7fdc0763242994866d812f7cf3aeebaaa727f6f35" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.565340 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" event={"ID":"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5","Type":"ContainerDied","Data":"6a5d33aa67dcba33d9cca4c7fdc0763242994866d812f7cf3aeebaaa727f6f35"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.565415 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b59764b5c-95h4h" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.604318 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.612254 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.612788 4732 generic.go:334] "Generic (PLEG): container finished" podID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" exitCode=0 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.612833 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9v88" event={"ID":"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030","Type":"ContainerDied","Data":"0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.615653 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ac6dedf8-3428-4444-86f9-4f25c0b916e3/ovsdbserver-nb/0.log" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.615679 4732 generic.go:334] "Generic (PLEG): container finished" podID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerID="3368687e64b8c0c613b949dc899a5ce3a9150e48d0028af9fccd8eb195d75f5b" exitCode=2 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.615706 4732 
generic.go:334] "Generic (PLEG): container finished" podID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerID="ed3b3ed20134b6ded19f5781525993fc3a4c2c1ec91fdf5fb5344ce971381d40" exitCode=143 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.615720 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac6dedf8-3428-4444-86f9-4f25c0b916e3","Type":"ContainerDied","Data":"3368687e64b8c0c613b949dc899a5ce3a9150e48d0028af9fccd8eb195d75f5b"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.615735 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac6dedf8-3428-4444-86f9-4f25c0b916e3","Type":"ContainerDied","Data":"ed3b3ed20134b6ded19f5781525993fc3a4c2c1ec91fdf5fb5344ce971381d40"} Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618072 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-svc\") pod \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618161 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8f8l\" (UniqueName: \"kubernetes.io/projected/c58c2bae-9347-4644-ae19-ff3781571610-kube-api-access-h8f8l\") pod \"c58c2bae-9347-4644-ae19-ff3781571610\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618185 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config-secret\") pod \"c58c2bae-9347-4644-ae19-ff3781571610\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618210 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-nb\") pod \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618244 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-swift-storage-0\") pod \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618263 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-combined-ca-bundle\") pod \"c58c2bae-9347-4644-ae19-ff3781571610\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cbtv\" (UniqueName: \"kubernetes.io/projected/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-kube-api-access-2cbtv\") pod \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618359 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config\") pod \"c58c2bae-9347-4644-ae19-ff3781571610\" (UID: \"c58c2bae-9347-4644-ae19-ff3781571610\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618417 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-sb\") pod \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\" (UID: 
\"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.618435 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-config\") pod \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\" (UID: \"1c76de27-f32a-47ec-ba19-1b8e7a5e6be5\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.621706 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-qnzwq"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.627219 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58c2bae-9347-4644-ae19-ff3781571610-kube-api-access-h8f8l" (OuterVolumeSpecName: "kube-api-access-h8f8l") pod "c58c2bae-9347-4644-ae19-ff3781571610" (UID: "c58c2bae-9347-4644-ae19-ff3781571610"). InnerVolumeSpecName "kube-api-access-h8f8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.639281 4732 scope.go:117] "RemoveContainer" containerID="695c8e21da8fa77374077cd2cc05c6d275fab9cd31581217ca2977b370adcc19" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.648246 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-qnzwq"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.650017 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-kube-api-access-2cbtv" (OuterVolumeSpecName: "kube-api-access-2cbtv") pod "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" (UID: "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5"). InnerVolumeSpecName "kube-api-access-2cbtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.678604 4732 scope.go:117] "RemoveContainer" containerID="9babba80be3c1a6fd055e84387fb3c74f74af8c92bd4f83ebb53cf7d2b84b84d" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.680831 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ac6dedf8-3428-4444-86f9-4f25c0b916e3/ovsdbserver-nb/0.log" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.680952 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.716934 4732 scope.go:117] "RemoveContainer" containerID="6a693cf87726dd07aef0243f81c2ca77c5d4545a90e8f3f043f685eaf87b6af5" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.720454 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8f8l\" (UniqueName: \"kubernetes.io/projected/c58c2bae-9347-4644-ae19-ff3781571610-kube-api-access-h8f8l\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.720491 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cbtv\" (UniqueName: \"kubernetes.io/projected/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-kube-api-access-2cbtv\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.746884 4732 scope.go:117] "RemoveContainer" containerID="6a5d33aa67dcba33d9cca4c7fdc0763242994866d812f7cf3aeebaaa727f6f35" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.771191 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c58c2bae-9347-4644-ae19-ff3781571610" (UID: "c58c2bae-9347-4644-ae19-ff3781571610"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.775985 4732 scope.go:117] "RemoveContainer" containerID="731da02a1e6751e1d7213579ef56876796ccc65a87796ddba50224b4c70cdce5" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.776170 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c58c2bae-9347-4644-ae19-ff3781571610" (UID: "c58c2bae-9347-4644-ae19-ff3781571610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.792366 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-config" (OuterVolumeSpecName: "config") pod "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" (UID: "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.802473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" (UID: "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.804053 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" (UID: "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.808910 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c58c2bae-9347-4644-ae19-ff3781571610" (UID: "c58c2bae-9347-4644-ae19-ff3781571610"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.811735 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" (UID: "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.818211 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" (UID: "1c76de27-f32a-47ec-ba19-1b8e7a5e6be5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.821634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.821769 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-combined-ca-bundle\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.821798 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-scripts\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.821916 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.821950 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdb-rundir\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.821976 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-metrics-certs-tls-certs\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822000 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnjfb\" (UniqueName: \"kubernetes.io/projected/ac6dedf8-3428-4444-86f9-4f25c0b916e3-kube-api-access-wnjfb\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822066 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-config\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822432 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822447 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822456 4732 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822465 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58c2bae-9347-4644-ae19-ff3781571610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 
07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822475 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c58c2bae-9347-4644-ae19-ff3781571610-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822483 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822493 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.822504 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.823038 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-config" (OuterVolumeSpecName: "config") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.823819 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-scripts" (OuterVolumeSpecName: "scripts") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.824545 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.826049 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.827274 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6dedf8-3428-4444-86f9-4f25c0b916e3-kube-api-access-wnjfb" (OuterVolumeSpecName: "kube-api-access-wnjfb") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "kube-api-access-wnjfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: E1010 07:13:26.860263 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 07:13:26 crc kubenswrapper[4732]: E1010 07:13:26.867801 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 07:13:26 crc kubenswrapper[4732]: E1010 07:13:26.878551 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.878606 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: E1010 07:13:26.878640 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="ovn-northd" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.878679 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="710f9fa6-588e-4226-a65d-5220d0a1f315" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.195:6080/vnc_lite.html\": dial tcp 10.217.0.195:6080: connect: connection refused" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.920750 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement0e64-account-delete-6ncrf"] Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.929625 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.929653 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.929681 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.929706 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdb-rundir\") on node 
\"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.929722 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnjfb\" (UniqueName: \"kubernetes.io/projected/ac6dedf8-3428-4444-86f9-4f25c0b916e3-kube-api-access-wnjfb\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.929731 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac6dedf8-3428-4444-86f9-4f25c0b916e3-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.942382 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder62ae-account-delete-jd7lb"] Oct 10 07:13:26 crc kubenswrapper[4732]: W1010 07:13:26.951911 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ea62a47_1d15_41a2_a0d0_a0456a46183a.slice/crio-58067ed69b9b8f07a74796cd8788caa414ad014ed38081803ee90cab44508785 WatchSource:0}: Error finding container 58067ed69b9b8f07a74796cd8788caa414ad014ed38081803ee90cab44508785: Status 404 returned error can't find the container with id 58067ed69b9b8f07a74796cd8788caa414ad014ed38081803ee90cab44508785 Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.989141 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:26 crc kubenswrapper[4732]: I1010 07:13:26.993499 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b59764b5c-95h4h"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.042842 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.058359 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.074065 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs\") pod \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\" (UID: \"ac6dedf8-3428-4444-86f9-4f25c0b916e3\") " Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.075598 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.075620 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:27 crc kubenswrapper[4732]: W1010 07:13:27.075726 4732 empty_dir.go:500] Warning: Unmount skipped because path does not exist: 
/var/lib/kubelet/pods/ac6dedf8-3428-4444-86f9-4f25c0b916e3/volumes/kubernetes.io~secret/ovsdbserver-nb-tls-certs Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.077240 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "ac6dedf8-3428-4444-86f9-4f25c0b916e3" (UID: "ac6dedf8-3428-4444-86f9-4f25c0b916e3"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.122594 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b59764b5c-95h4h"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.151588 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicand0a1-account-delete-5xtdn"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.160426 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrone3b5-account-delete-dtqm9"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.173017 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancedb70-account-delete-6srhp"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.177988 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac6dedf8-3428-4444-86f9-4f25c0b916e3-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.280153 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.280239 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data podName:88a11668-5ab6-4b77-8bb7-ac60140f4bd4 nodeName:}" failed. 
No retries permitted until 2025-10-10 07:13:31.280219062 +0000 UTC m=+1338.349810303 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data") pod "rabbitmq-cell1-server-0" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4") : configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:27 crc kubenswrapper[4732]: W1010 07:13:27.442576 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56a05e0a_f30f_4b7c_b939_eba8d0094d48.slice/crio-830bd467873fa8b3a42ce8ae4d7d2c0fc08f60e2cf63a026d600c0a94aabca4e WatchSource:0}: Error finding container 830bd467873fa8b3a42ce8ae4d7d2c0fc08f60e2cf63a026d600c0a94aabca4e: Status 404 returned error can't find the container with id 830bd467873fa8b3a42ce8ae4d7d2c0fc08f60e2cf63a026d600c0a94aabca4e Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.445712 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi0b57-account-delete-95cjw"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.480832 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell00878-account-delete-pjh75"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.494464 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1a8d0-account-delete-tcwnj"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.648945 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1a8d0-account-delete-tcwnj" event={"ID":"56a05e0a-f30f-4b7c-b939-eba8d0094d48","Type":"ContainerStarted","Data":"830bd467873fa8b3a42ce8ae4d7d2c0fc08f60e2cf63a026d600c0a94aabca4e"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.652784 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0fa844d-f411-49a9-a52f-256760a71157" containerID="ef9057491152a5767b996f9aa867ebd6dd43e2419bbb3e1338a179c119d1dd11" 
exitCode=0 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.652846 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59899c8879-prgpj" event={"ID":"d0fa844d-f411-49a9-a52f-256760a71157","Type":"ContainerDied","Data":"ef9057491152a5767b996f9aa867ebd6dd43e2419bbb3e1338a179c119d1dd11"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.654083 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi0b57-account-delete-95cjw" event={"ID":"c76de706-34bc-4b37-8492-3573c19e91c2","Type":"ContainerStarted","Data":"e85918946a4cd07b24a7309533bb5dce5941629dbd2282efb0cae6d4d9dae54f"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.655670 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00878-account-delete-pjh75" event={"ID":"1f7ba305-07fd-408c-865f-463e3738e6cb","Type":"ContainerStarted","Data":"e6e45d2c8e155eccba8eac7001fad86d0afbc66d286de286a6da54351d909af9"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.657088 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ac6dedf8-3428-4444-86f9-4f25c0b916e3/ovsdbserver-nb/0.log" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.657138 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ac6dedf8-3428-4444-86f9-4f25c0b916e3","Type":"ContainerDied","Data":"ed32738bd923676863c786b74fb577e6ed95cefb0ac7a429559d826e3a76d685"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.657160 4732 scope.go:117] "RemoveContainer" containerID="3368687e64b8c0c613b949dc899a5ce3a9150e48d0028af9fccd8eb195d75f5b" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.657290 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.678620 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ba01b5-953d-4178-b73c-5b5f13268e13" path="/var/lib/kubelet/pods/06ba01b5-953d-4178-b73c-5b5f13268e13/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.679175 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" path="/var/lib/kubelet/pods/0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.679773 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" path="/var/lib/kubelet/pods/1c76de27-f32a-47ec-ba19-1b8e7a5e6be5/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.686950 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f43028-56f9-42d6-ad26-631d79465b65" path="/var/lib/kubelet/pods/20f43028-56f9-42d6-ad26-631d79465b65/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.688076 4732 generic.go:334] "Generic (PLEG): container finished" podID="710f9fa6-588e-4226-a65d-5220d0a1f315" containerID="f9506fceaa77699397e7b29b0e67d5a568de582fd92174a2332250afa9eed955" exitCode=0 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.688158 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e5a89ea-9d74-48e6-8255-62ebd3feaa52" path="/var/lib/kubelet/pods/2e5a89ea-9d74-48e6-8255-62ebd3feaa52/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.688790 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" path="/var/lib/kubelet/pods/64dcf265-8f29-46bc-9b03-40dda51f606b/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.690249 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d361b6b-d6cf-44c8-ba94-5cbba8dae55e" 
path="/var/lib/kubelet/pods/6d361b6b-d6cf-44c8-ba94-5cbba8dae55e/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.690806 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d185c7-30e1-4efc-acd8-5ce5b3784a47" path="/var/lib/kubelet/pods/95d185c7-30e1-4efc-acd8-5ce5b3784a47/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.691297 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5a1acc-920d-437b-b952-80e1cb9fc587" path="/var/lib/kubelet/pods/ba5a1acc-920d-437b-b952-80e1cb9fc587/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.692143 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58c2bae-9347-4644-ae19-ff3781571610" path="/var/lib/kubelet/pods/c58c2bae-9347-4644-ae19-ff3781571610/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.693080 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85cd845-7899-4892-be21-259881ff6ed5" path="/var/lib/kubelet/pods/c85cd845-7899-4892-be21-259881ff6ed5/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.693721 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce79fc9b-55ca-4b99-adef-500f3cf92f81" path="/var/lib/kubelet/pods/ce79fc9b-55ca-4b99-adef-500f3cf92f81/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.694348 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf4e373d-3210-42f5-9ec0-506c454718d2" path="/var/lib/kubelet/pods/cf4e373d-3210-42f5-9ec0-506c454718d2/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.694905 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.694968 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data 
podName:565f831c-0da8-4481-8461-8522e0cfa801 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:31.694954215 +0000 UTC m=+1338.764545456 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data") pod "rabbitmq-server-0" (UID: "565f831c-0da8-4481-8461-8522e0cfa801") : configmap "rabbitmq-config-data" not found Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.695336 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6" path="/var/lib/kubelet/pods/e6ebbbc6-d85c-4bbb-a8a1-c58b24bc40a6/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.695807 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9928b55-49e9-4091-95de-77a8c1a01318" path="/var/lib/kubelet/pods/e9928b55-49e9-4091-95de-77a8c1a01318/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.696258 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca127ce-2fe2-49bd-94f2-d772fcffb2d5" path="/var/lib/kubelet/pods/eca127ce-2fe2-49bd-94f2-d772fcffb2d5/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.696705 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee98afa3-91b1-4d45-9bf8-e3659b14be63" path="/var/lib/kubelet/pods/ee98afa3-91b1-4d45-9bf8-e3659b14be63/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.697506 4732 generic.go:334] "Generic (PLEG): container finished" podID="8d80b654-a26e-46ea-84f4-264c3c883250" containerID="da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8" exitCode=0 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.697882 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eec93206-5a99-4edd-a303-0d8dee1658dc" path="/var/lib/kubelet/pods/eec93206-5a99-4edd-a303-0d8dee1658dc/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 
07:13:27.698350 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6767f74-220c-4299-ad0a-a12dcc2d7e24" path="/var/lib/kubelet/pods/f6767f74-220c-4299-ad0a-a12dcc2d7e24/volumes" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.699029 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancedb70-account-delete-6srhp" event={"ID":"03efe727-1f84-49e0-b6cb-a7189a02ba76","Type":"ContainerStarted","Data":"f9d07bdbcdc231e9a236d5fc89c80e8824bf1161f878aa426bf13bbebfb4a25f"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.699054 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder62ae-account-delete-jd7lb" event={"ID":"80309f7c-d137-4116-a447-c9749c27c669","Type":"ContainerStarted","Data":"427cac54f3478cf3880b320d223ca4b0983088f7b7f2630092e6d598ae35eca6"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.699067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"710f9fa6-588e-4226-a65d-5220d0a1f315","Type":"ContainerDied","Data":"f9506fceaa77699397e7b29b0e67d5a568de582fd92174a2332250afa9eed955"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.699079 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"710f9fa6-588e-4226-a65d-5220d0a1f315","Type":"ContainerDied","Data":"276b8a0dba902f244256d6f181af88820089bdaf2659e8b864712ff6faa77dbd"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.699088 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="276b8a0dba902f244256d6f181af88820089bdaf2659e8b864712ff6faa77dbd" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.699096 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d80b654-a26e-46ea-84f4-264c3c883250","Type":"ContainerDied","Data":"da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8"} Oct 10 07:13:27 crc 
kubenswrapper[4732]: I1010 07:13:27.705583 4732 generic.go:334] "Generic (PLEG): container finished" podID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerID="37340908e0e97c3a729ef1965ebc6960980d4f8695b1143d55d26a38eea03ce7" exitCode=143 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.705655 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7949456448-wncp2" event={"ID":"2960d902-25b0-4fb8-baa7-fe7f9d4f5811","Type":"ContainerDied","Data":"37340908e0e97c3a729ef1965ebc6960980d4f8695b1143d55d26a38eea03ce7"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.708185 4732 generic.go:334] "Generic (PLEG): container finished" podID="56077f87-ea67-4080-b328-7186a7d0bf35" containerID="101a8d6951b18acd6a4c085a8be7084960161610e3440bd4359884be73a60369" exitCode=143 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.708263 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56077f87-ea67-4080-b328-7186a7d0bf35","Type":"ContainerDied","Data":"101a8d6951b18acd6a4c085a8be7084960161610e3440bd4359884be73a60369"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.714866 4732 generic.go:334] "Generic (PLEG): container finished" podID="4ea62a47-1d15-41a2-a0d0-a0456a46183a" containerID="1efcb6805df6c3ebeab4e1ae74758ec0a924658e669b9a8aed99ea90a2dcebc7" exitCode=0 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.714946 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0e64-account-delete-6ncrf" event={"ID":"4ea62a47-1d15-41a2-a0d0-a0456a46183a","Type":"ContainerDied","Data":"1efcb6805df6c3ebeab4e1ae74758ec0a924658e669b9a8aed99ea90a2dcebc7"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.714971 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0e64-account-delete-6ncrf" event={"ID":"4ea62a47-1d15-41a2-a0d0-a0456a46183a","Type":"ContainerStarted","Data":"58067ed69b9b8f07a74796cd8788caa414ad014ed38081803ee90cab44508785"} 
Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.739827 4732 generic.go:334] "Generic (PLEG): container finished" podID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerID="61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298" exitCode=0 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.739894 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"63706a24-ebfd-45ae-96b0-49ab7bd13fdf","Type":"ContainerDied","Data":"61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.758950 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrone3b5-account-delete-dtqm9" event={"ID":"5c751e0c-75c7-4aaf-bf32-55e6d022d802","Type":"ContainerStarted","Data":"71fc0b1cd366e38052793e0669137aa6d7ece8fc603eecabfae54a5b05f0a322"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.766331 4732 generic.go:334] "Generic (PLEG): container finished" podID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerID="aaf10e61456ff1882f86011615c89d1ec3d16648bdf71900a294e95ad885a047" exitCode=143 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.766378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f34ab2c-f804-4f24-a447-165d5afb984f","Type":"ContainerDied","Data":"aaf10e61456ff1882f86011615c89d1ec3d16648bdf71900a294e95ad885a047"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.788800 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicand0a1-account-delete-5xtdn" event={"ID":"cb766f51-b132-4979-b32e-a2cfcb3edb50","Type":"ContainerStarted","Data":"dc6c82fe6ffb2f5e067bd499ba53169442c73bc301cb9f5eae59512ac1b0a5d1"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.815053 4732 generic.go:334] "Generic (PLEG): container finished" podID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" 
containerID="3854a4792dd684bc5e205f322924b756bc947446fb50e6b4d2c49c8df807513b" exitCode=0 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.815096 4732 generic.go:334] "Generic (PLEG): container finished" podID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerID="0865053222fa0b5007b970a0a688d80dffb2106bd8ecb37be061b0d8aaf978cd" exitCode=143 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.815176 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" event={"ID":"f5d96c35-c01e-4f12-ab12-7b6342789b2f","Type":"ContainerDied","Data":"3854a4792dd684bc5e205f322924b756bc947446fb50e6b4d2c49c8df807513b"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.815206 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" event={"ID":"f5d96c35-c01e-4f12-ab12-7b6342789b2f","Type":"ContainerDied","Data":"0865053222fa0b5007b970a0a688d80dffb2106bd8ecb37be061b0d8aaf978cd"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.832453 4732 generic.go:334] "Generic (PLEG): container finished" podID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerID="cc76fa90e7b162f0b66e824eee5ff268aceed1434ce758c6900d6e7104073f19" exitCode=0 Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.832795 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" event={"ID":"7daaf3e5-82f0-45f7-aa22-40be65433320","Type":"ContainerDied","Data":"cc76fa90e7b162f0b66e824eee5ff268aceed1434ce758c6900d6e7104073f19"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.832849 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" event={"ID":"7daaf3e5-82f0-45f7-aa22-40be65433320","Type":"ContainerDied","Data":"5874141244f3843be5731f1bb9474d7151f98861d4d7fcf7bccd405260d27426"} Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.832867 4732 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="5874141244f3843be5731f1bb9474d7151f98861d4d7fcf7bccd405260d27426" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.839277 4732 scope.go:117] "RemoveContainer" containerID="ed3b3ed20134b6ded19f5781525993fc3a4c2c1ec91fdf5fb5344ce971381d40" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.874125 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.888643 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.905346 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 10 07:13:27 crc kubenswrapper[4732]: I1010 07:13:27.943680 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.996207 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.997826 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.999631 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:13:27 crc kubenswrapper[4732]: E1010 07:13:27.999659 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="53c7a322-6bdd-4613-9a25-39391becbb81" containerName="nova-scheduler-scheduler" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.005881 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsq5c\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-kube-api-access-zsq5c\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.005956 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-run-httpd\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.005977 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-internal-tls-certs\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.005999 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-combined-ca-bundle\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 
07:13:28.006013 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-config-data\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006044 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfflx\" (UniqueName: \"kubernetes.io/projected/710f9fa6-588e-4226-a65d-5220d0a1f315-kube-api-access-lfflx\") pod \"710f9fa6-588e-4226-a65d-5220d0a1f315\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-public-tls-certs\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006154 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-combined-ca-bundle\") pod \"710f9fa6-588e-4226-a65d-5220d0a1f315\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006177 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-config-data\") pod \"710f9fa6-588e-4226-a65d-5220d0a1f315\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006208 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-vencrypt-tls-certs\") pod 
\"710f9fa6-588e-4226-a65d-5220d0a1f315\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006240 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-log-httpd\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006259 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-etc-swift\") pod \"7daaf3e5-82f0-45f7-aa22-40be65433320\" (UID: \"7daaf3e5-82f0-45f7-aa22-40be65433320\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.006299 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-nova-novncproxy-tls-certs\") pod \"710f9fa6-588e-4226-a65d-5220d0a1f315\" (UID: \"710f9fa6-588e-4226-a65d-5220d0a1f315\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.008725 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: "7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.008858 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: "7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.017045 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710f9fa6-588e-4226-a65d-5220d0a1f315-kube-api-access-lfflx" (OuterVolumeSpecName: "kube-api-access-lfflx") pod "710f9fa6-588e-4226-a65d-5220d0a1f315" (UID: "710f9fa6-588e-4226-a65d-5220d0a1f315"). InnerVolumeSpecName "kube-api-access-lfflx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.017956 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: "7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.020000 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-kube-api-access-zsq5c" (OuterVolumeSpecName: "kube-api-access-zsq5c") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: "7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "kube-api-access-zsq5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.067485 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.067868 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-central-agent" containerID="cri-o://03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.068366 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="proxy-httpd" containerID="cri-o://f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.068427 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="sg-core" containerID="cri-o://7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.068475 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-notification-agent" containerID="cri-o://d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.111320 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.111350 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.111361 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsq5c\" (UniqueName: \"kubernetes.io/projected/7daaf3e5-82f0-45f7-aa22-40be65433320-kube-api-access-zsq5c\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.111372 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7daaf3e5-82f0-45f7-aa22-40be65433320-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.111380 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfflx\" (UniqueName: \"kubernetes.io/projected/710f9fa6-588e-4226-a65d-5220d0a1f315-kube-api-access-lfflx\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.125490 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-config-data" (OuterVolumeSpecName: "config-data") pod "710f9fa6-588e-4226-a65d-5220d0a1f315" (UID: "710f9fa6-588e-4226-a65d-5220d0a1f315"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.146618 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "710f9fa6-588e-4226-a65d-5220d0a1f315" (UID: "710f9fa6-588e-4226-a65d-5220d0a1f315"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.151180 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.151372 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" containerName="kube-state-metrics" containerID="cri-o://90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.217142 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.217417 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.223092 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: "7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.231605 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.231903 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="ab930cd4-caad-4980-a491-8f6c5abca8bf" containerName="memcached" containerID="cri-o://9ccbb101b60a4c5d1f4f9801ccb65f5ad73e384c6a69193236f1dbf249839c94" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.258492 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: "7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.263568 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298 is running failed: container process not found" containerID="61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.266342 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298 is running failed: container process not found" containerID="61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 
07:13:28.266684 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298 is running failed: container process not found" containerID="61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.266772 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerName="galera" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.312021 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-p2tz8"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.322571 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.332342 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.329619 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-p2tz8"] Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.335449 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8 is running failed: container process not found" containerID="da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.335477 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8c7d5b696-rkhkz"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.335716 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-8c7d5b696-rkhkz" podUID="b93e689a-691a-403b-970f-63547469bbfe" containerName="keystone-api" containerID="cri-o://e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.335940 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8 is running failed: container process not found" containerID="da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.336428 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8 is running failed: container process not found" containerID="da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.336475 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8 is running failed: container process not found" 
probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="8d80b654-a26e-46ea-84f4-264c3c883250" containerName="nova-cell1-conductor-conductor" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.356846 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": read tcp 10.217.0.2:53324->10.217.0.163:8776: read: connection reset by peer" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.357113 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mz64q"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.369390 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mz64q"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.376770 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.393452 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vhn47"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.405579 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vhn47"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.414225 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-653d-account-create-9hqh8"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.418826 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-653d-account-create-9hqh8"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.441239 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-config-data" (OuterVolumeSpecName: "config-data") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: 
"7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.489493 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "710f9fa6-588e-4226-a65d-5220d0a1f315" (UID: "710f9fa6-588e-4226-a65d-5220d0a1f315"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.532654 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7daaf3e5-82f0-45f7-aa22-40be65433320" (UID: "7daaf3e5-82f0-45f7-aa22-40be65433320"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.545979 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "710f9fa6-588e-4226-a65d-5220d0a1f315" (UID: "710f9fa6-588e-4226-a65d-5220d0a1f315"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.551950 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.551975 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7daaf3e5-82f0-45f7-aa22-40be65433320-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.551983 4732 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.551994 4732 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/710f9fa6-588e-4226-a65d-5220d0a1f315-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.677590 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerName="galera" containerID="cri-o://eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8" gracePeriod=30 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.731823 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.742243 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.748640 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.766091 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.811857 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement0e64-account-delete-6ncrf" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.812743 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.826927 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.849130 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb179b69_8c25_49b1_88b5_6c17953ffbcd.slice/crio-conmon-f4d100a6f0ecddf29e4d78c67e5019b69856d21ffbb0b74a4b0262657c6c0304.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded592ee3_6dab_41d4_8141_bb7c31b02f73.slice/crio-conmon-f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded592ee3_6dab_41d4_8141_bb7c31b02f73.slice/crio-conmon-03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded592ee3_6dab_41d4_8141_bb7c31b02f73.slice/crio-conmon-7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded592ee3_6dab_41d4_8141_bb7c31b02f73.slice/crio-f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded592ee3_6dab_41d4_8141_bb7c31b02f73.slice/crio-7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb179b69_8c25_49b1_88b5_6c17953ffbcd.slice/crio-f4d100a6f0ecddf29e4d78c67e5019b69856d21ffbb0b74a4b0262657c6c0304.scope\": RecentStats: unable to find data in memory cache]" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.887205 4732 generic.go:334] "Generic (PLEG): container finished" podID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerID="698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686" exitCode=0 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.887295 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"27b69405-bc4b-4e39-be49-0a966bc649bb","Type":"ContainerDied","Data":"698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.887324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"27b69405-bc4b-4e39-be49-0a966bc649bb","Type":"ContainerDied","Data":"33f47b720904e70a98f517a0875c1ad11459a2f94b9d524e5f239ba0d5cb06d7"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.887342 4732 scope.go:117] "RemoveContainer" containerID="335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.887479 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.903705 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-59899c8879-prgpj" event={"ID":"d0fa844d-f411-49a9-a52f-256760a71157","Type":"ContainerDied","Data":"d25edfdabbfd8eb20706eb8c215c54f7b2f5b802298396949e349e2ee7370877"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.905227 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-59899c8879-prgpj" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906109 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data\") pod \"d0fa844d-f411-49a9-a52f-256760a71157\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906135 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0fa844d-f411-49a9-a52f-256760a71157-logs\") pod \"d0fa844d-f411-49a9-a52f-256760a71157\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906155 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-secrets\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906178 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data-custom\") pod \"d0fa844d-f411-49a9-a52f-256760a71157\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 
07:13:28.906219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-config-data\") pod \"8d80b654-a26e-46ea-84f4-264c3c883250\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906237 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906265 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-generated\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906279 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5d96c35-c01e-4f12-ab12-7b6342789b2f-logs\") pod \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906298 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nlbd\" (UniqueName: \"kubernetes.io/projected/f5d96c35-c01e-4f12-ab12-7b6342789b2f-kube-api-access-7nlbd\") pod \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906317 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-combined-ca-bundle\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" 
(UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906362 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-operator-scripts\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906379 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-galera-tls-certs\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906401 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlcgg\" (UniqueName: \"kubernetes.io/projected/d0fa844d-f411-49a9-a52f-256760a71157-kube-api-access-hlcgg\") pod \"d0fa844d-f411-49a9-a52f-256760a71157\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906424 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-combined-ca-bundle\") pod \"8d80b654-a26e-46ea-84f4-264c3c883250\" (UID: \"8d80b654-a26e-46ea-84f4-264c3c883250\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906453 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-combined-ca-bundle\") pod \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906469 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data-custom\") pod \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906501 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84slz\" (UniqueName: \"kubernetes.io/projected/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kube-api-access-84slz\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906520 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-default\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906575 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-combined-ca-bundle\") pod \"d0fa844d-f411-49a9-a52f-256760a71157\" (UID: \"d0fa844d-f411-49a9-a52f-256760a71157\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906596 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kolla-config\") pod \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\" (UID: \"63706a24-ebfd-45ae-96b0-49ab7bd13fdf\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906641 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rwpx\" (UniqueName: \"kubernetes.io/projected/8d80b654-a26e-46ea-84f4-264c3c883250-kube-api-access-5rwpx\") pod \"8d80b654-a26e-46ea-84f4-264c3c883250\" (UID: 
\"8d80b654-a26e-46ea-84f4-264c3c883250\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.906662 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data\") pod \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\" (UID: \"f5d96c35-c01e-4f12-ab12-7b6342789b2f\") " Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.908430 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.912092 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0fa844d-f411-49a9-a52f-256760a71157-logs" (OuterVolumeSpecName: "logs") pod "d0fa844d-f411-49a9-a52f-256760a71157" (UID: "d0fa844d-f411-49a9-a52f-256760a71157"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.914556 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.915213 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5d96c35-c01e-4f12-ab12-7b6342789b2f-logs" (OuterVolumeSpecName: "logs") pod "f5d96c35-c01e-4f12-ab12-7b6342789b2f" (UID: "f5d96c35-c01e-4f12-ab12-7b6342789b2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.915576 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.918293 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.918524 4732 generic.go:334] "Generic (PLEG): container finished" podID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerID="f4d100a6f0ecddf29e4d78c67e5019b69856d21ffbb0b74a4b0262657c6c0304" exitCode=0 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.918574 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb179b69-8c25-49b1-88b5-6c17953ffbcd","Type":"ContainerDied","Data":"f4d100a6f0ecddf29e4d78c67e5019b69856d21ffbb0b74a4b0262657c6c0304"} Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.920858 4732 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.921046 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kube-api-access-84slz" (OuterVolumeSpecName: "kube-api-access-84slz") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "kube-api-access-84slz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.924427 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8d80b654-a26e-46ea-84f4-264c3c883250","Type":"ContainerDied","Data":"63a8a10c8dccd10c83706e956446a412ad4cc2b1be0170c40baf3133c175aad6"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.929536 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d445dfc98-wk5w4" event={"ID":"3e37998e-491a-43b8-abda-4bdfea233217","Type":"ContainerDied","Data":"ba7a1f03f18ae86234997ab8ec3532045109f6ac2550d2fdf25633eb2d62be0a"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.927153 4732 generic.go:334] "Generic (PLEG): container finished" podID="3e37998e-491a-43b8-abda-4bdfea233217" containerID="ba7a1f03f18ae86234997ab8ec3532045109f6ac2550d2fdf25633eb2d62be0a" exitCode=0 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.924486 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.931902 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 07:13:28 crc kubenswrapper[4732]: E1010 07:13:28.931996 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="13bb7b78-cc62-4d3b-a33a-9af77ee9e141" containerName="nova-cell0-conductor-conductor" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.932619 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.939024 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f5d96c35-c01e-4f12-ab12-7b6342789b2f" (UID: "f5d96c35-c01e-4f12-ab12-7b6342789b2f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.939155 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5d96c35-c01e-4f12-ab12-7b6342789b2f-kube-api-access-7nlbd" (OuterVolumeSpecName: "kube-api-access-7nlbd") pod "f5d96c35-c01e-4f12-ab12-7b6342789b2f" (UID: "f5d96c35-c01e-4f12-ab12-7b6342789b2f"). InnerVolumeSpecName "kube-api-access-7nlbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.943673 4732 scope.go:117] "RemoveContainer" containerID="698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.946517 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" event={"ID":"f5d96c35-c01e-4f12-ab12-7b6342789b2f","Type":"ContainerDied","Data":"5c84d8a53ef2654754a32aa5cd3d813906d216569388acec620ead3a4b4da84e"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.946604 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-69886dc6f8-sfv6d" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.948764 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-secrets" (OuterVolumeSpecName: "secrets") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.948781 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d80b654-a26e-46ea-84f4-264c3c883250-kube-api-access-5rwpx" (OuterVolumeSpecName: "kube-api-access-5rwpx") pod "8d80b654-a26e-46ea-84f4-264c3c883250" (UID: "8d80b654-a26e-46ea-84f4-264c3c883250"). InnerVolumeSpecName "kube-api-access-5rwpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.949235 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d0fa844d-f411-49a9-a52f-256760a71157" (UID: "d0fa844d-f411-49a9-a52f-256760a71157"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.950169 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0fa844d-f411-49a9-a52f-256760a71157-kube-api-access-hlcgg" (OuterVolumeSpecName: "kube-api-access-hlcgg") pod "d0fa844d-f411-49a9-a52f-256760a71157" (UID: "d0fa844d-f411-49a9-a52f-256760a71157"). InnerVolumeSpecName "kube-api-access-hlcgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.959858 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.967415 4732 generic.go:334] "Generic (PLEG): container finished" podID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerID="f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1" exitCode=0 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.967439 4732 generic.go:334] "Generic (PLEG): container finished" podID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerID="7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd" exitCode=2 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.967446 4732 generic.go:334] "Generic (PLEG): container finished" podID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerID="03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12" exitCode=0 Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.967484 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerDied","Data":"f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.967508 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerDied","Data":"7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.967517 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerDied","Data":"03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.971710 4732 generic.go:334] "Generic (PLEG): container finished" podID="d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" containerID="90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592" exitCode=2 Oct 10 07:13:28 crc 
kubenswrapper[4732]: I1010 07:13:28.971748 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85","Type":"ContainerDied","Data":"90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.971765 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85","Type":"ContainerDied","Data":"efce9cddac093e68f8121d0f1a22775f2c1cbc31b6718bc684d6abfec9b531f5"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.971813 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.973394 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0e64-account-delete-6ncrf" event={"ID":"4ea62a47-1d15-41a2-a0d0-a0456a46183a","Type":"ContainerDied","Data":"58067ed69b9b8f07a74796cd8788caa414ad014ed38081803ee90cab44508785"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.973450 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement0e64-account-delete-6ncrf" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.989270 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.990315 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.990657 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"63706a24-ebfd-45ae-96b0-49ab7bd13fdf","Type":"ContainerDied","Data":"a09ab7179a6af806a1367e461c78cdfe29d6a8b06c2347053b2e5cecb6477e47"} Oct 10 07:13:28 crc kubenswrapper[4732]: I1010 07:13:28.990788 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-76d49dbb9c-8g2mb" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.018758 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data\") pod \"27b69405-bc4b-4e39-be49-0a966bc649bb\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.018809 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/27b69405-bc4b-4e39-be49-0a966bc649bb-etc-machine-id\") pod \"27b69405-bc4b-4e39-be49-0a966bc649bb\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.018902 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx95j\" (UniqueName: \"kubernetes.io/projected/4ea62a47-1d15-41a2-a0d0-a0456a46183a-kube-api-access-kx95j\") pod \"4ea62a47-1d15-41a2-a0d0-a0456a46183a\" (UID: \"4ea62a47-1d15-41a2-a0d0-a0456a46183a\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020246 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-combined-ca-bundle\") pod \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " Oct 10 
07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020295 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzsw6\" (UniqueName: \"kubernetes.io/projected/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-api-access-dzsw6\") pod \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020430 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-config\") pod \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020514 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-certs\") pod \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\" (UID: \"d4e6ee2c-4b3b-48af-860d-f23aea3c4c85\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020549 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-combined-ca-bundle\") pod \"27b69405-bc4b-4e39-be49-0a966bc649bb\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020571 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data-custom\") pod \"27b69405-bc4b-4e39-be49-0a966bc649bb\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020588 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7gppq\" (UniqueName: \"kubernetes.io/projected/27b69405-bc4b-4e39-be49-0a966bc649bb-kube-api-access-7gppq\") pod \"27b69405-bc4b-4e39-be49-0a966bc649bb\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.020614 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-scripts\") pod \"27b69405-bc4b-4e39-be49-0a966bc649bb\" (UID: \"27b69405-bc4b-4e39-be49-0a966bc649bb\") " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.021125 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlcgg\" (UniqueName: \"kubernetes.io/projected/d0fa844d-f411-49a9-a52f-256760a71157-kube-api-access-hlcgg\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.021146 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.021156 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84slz\" (UniqueName: \"kubernetes.io/projected/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kube-api-access-84slz\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.021166 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.021175 4732 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 
07:13:29.021184 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rwpx\" (UniqueName: \"kubernetes.io/projected/8d80b654-a26e-46ea-84f4-264c3c883250-kube-api-access-5rwpx\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.021192 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0fa844d-f411-49a9-a52f-256760a71157-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.022065 4732 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-secrets\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.022077 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.022100 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.022111 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.022119 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5d96c35-c01e-4f12-ab12-7b6342789b2f-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.022128 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nlbd\" (UniqueName: 
\"kubernetes.io/projected/f5d96c35-c01e-4f12-ab12-7b6342789b2f-kube-api-access-7nlbd\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.022137 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.025010 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27b69405-bc4b-4e39-be49-0a966bc649bb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "27b69405-bc4b-4e39-be49-0a966bc649bb" (UID: "27b69405-bc4b-4e39-be49-0a966bc649bb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.043759 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-api-access-dzsw6" (OuterVolumeSpecName: "kube-api-access-dzsw6") pod "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" (UID: "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85"). InnerVolumeSpecName "kube-api-access-dzsw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.044065 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b69405-bc4b-4e39-be49-0a966bc649bb-kube-api-access-7gppq" (OuterVolumeSpecName: "kube-api-access-7gppq") pod "27b69405-bc4b-4e39-be49-0a966bc649bb" (UID: "27b69405-bc4b-4e39-be49-0a966bc649bb"). InnerVolumeSpecName "kube-api-access-7gppq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.044008 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea62a47-1d15-41a2-a0d0-a0456a46183a-kube-api-access-kx95j" (OuterVolumeSpecName: "kube-api-access-kx95j") pod "4ea62a47-1d15-41a2-a0d0-a0456a46183a" (UID: "4ea62a47-1d15-41a2-a0d0-a0456a46183a"). InnerVolumeSpecName "kube-api-access-kx95j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.053625 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "27b69405-bc4b-4e39-be49-0a966bc649bb" (UID: "27b69405-bc4b-4e39-be49-0a966bc649bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.059089 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-scripts" (OuterVolumeSpecName: "scripts") pod "27b69405-bc4b-4e39-be49-0a966bc649bb" (UID: "27b69405-bc4b-4e39-be49-0a966bc649bb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.123426 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.123459 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gppq\" (UniqueName: \"kubernetes.io/projected/27b69405-bc4b-4e39-be49-0a966bc649bb-kube-api-access-7gppq\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.123473 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.123484 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/27b69405-bc4b-4e39-be49-0a966bc649bb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.123499 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx95j\" (UniqueName: \"kubernetes.io/projected/4ea62a47-1d15-41a2-a0d0-a0456a46183a-kube-api-access-kx95j\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.123507 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzsw6\" (UniqueName: \"kubernetes.io/projected/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-api-access-dzsw6\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.203725 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.216957 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" (UID: "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.224845 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.224873 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.246687 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d80b654-a26e-46ea-84f4-264c3c883250" (UID: "8d80b654-a26e-46ea-84f4-264c3c883250"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.255564 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.306660 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5d96c35-c01e-4f12-ab12-7b6342789b2f" (UID: "f5d96c35-c01e-4f12-ab12-7b6342789b2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.326705 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.326737 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.326781 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.385853 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0fa844d-f411-49a9-a52f-256760a71157" (UID: "d0fa844d-f411-49a9-a52f-256760a71157"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.408128 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" (UID: "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.418069 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-config-data" (OuterVolumeSpecName: "config-data") pod "8d80b654-a26e-46ea-84f4-264c3c883250" (UID: "8d80b654-a26e-46ea-84f4-264c3c883250"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.434131 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.434457 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d80b654-a26e-46ea-84f4-264c3c883250-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.434470 4732 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.434584 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7949456448-wncp2" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" 
containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:60704->10.217.0.156:9311: read: connection reset by peer" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.434967 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7949456448-wncp2" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:60702->10.217.0.156:9311: read: connection reset by peer" Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.456529 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.456550 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data" (OuterVolumeSpecName: "config-data") pod "f5d96c35-c01e-4f12-ab12-7b6342789b2f" (UID: "f5d96c35-c01e-4f12-ab12-7b6342789b2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.456638 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" (UID: "d4e6ee2c-4b3b-48af-860d-f23aea3c4c85"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.462106 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "63706a24-ebfd-45ae-96b0-49ab7bd13fdf" (UID: "63706a24-ebfd-45ae-96b0-49ab7bd13fdf"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.463998 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.464037 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data" (OuterVolumeSpecName: "config-data") pod "d0fa844d-f411-49a9-a52f-256760a71157" (UID: "d0fa844d-f411-49a9-a52f-256760a71157"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.469819 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.470221 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.470261 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.474584 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.475756 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:29 crc kubenswrapper[4732]: E1010 07:13:29.475798 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.483935 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27b69405-bc4b-4e39-be49-0a966bc649bb" (UID: "27b69405-bc4b-4e39-be49-0a966bc649bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.515013 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lzkzk" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" probeResult="failure" output=< Oct 10 07:13:29 crc kubenswrapper[4732]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Oct 10 07:13:29 crc kubenswrapper[4732]: > Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.535817 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5d96c35-c01e-4f12-ab12-7b6342789b2f-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.535847 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0fa844d-f411-49a9-a52f-256760a71157-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.535858 4732 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/63706a24-ebfd-45ae-96b0-49ab7bd13fdf-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.535868 4732 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.535878 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.579625 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data" (OuterVolumeSpecName: "config-data") pod "27b69405-bc4b-4e39-be49-0a966bc649bb" (UID: "27b69405-bc4b-4e39-be49-0a966bc649bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.636712 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27b69405-bc4b-4e39-be49-0a966bc649bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.686219 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cc47b2-b0ca-4234-b5da-2779e1210367" path="/var/lib/kubelet/pods/55cc47b2-b0ca-4234-b5da-2779e1210367/volumes" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.686803 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801dc083-4a38-4af1-9bf1-b40a3c204e09" path="/var/lib/kubelet/pods/801dc083-4a38-4af1-9bf1-b40a3c204e09/volumes" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.687249 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9521a409-ec60-4dfb-b864-6fe4156581bf" path="/var/lib/kubelet/pods/9521a409-ec60-4dfb-b864-6fe4156581bf/volumes" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.688372 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" path="/var/lib/kubelet/pods/ac6dedf8-3428-4444-86f9-4f25c0b916e3/volumes" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.688972 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2578031-2533-4d9f-b953-0452e05e88e8" path="/var/lib/kubelet/pods/b2578031-2533-4d9f-b953-0452e05e88e8/volumes" Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.801951 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pjznz"] Oct 10 07:13:29 crc kubenswrapper[4732]: 
I1010 07:13:29.828628 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pjznz"] Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.835064 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0b57-account-create-glphj"] Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.843810 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi0b57-account-delete-95cjw"] Oct 10 07:13:29 crc kubenswrapper[4732]: I1010 07:13:29.869040 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0b57-account-create-glphj"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.006560 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": dial tcp 10.217.0.197:3000: connect: connection refused" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.016472 4732 generic.go:334] "Generic (PLEG): container finished" podID="ab930cd4-caad-4980-a491-8f6c5abca8bf" containerID="9ccbb101b60a4c5d1f4f9801ccb65f5ad73e384c6a69193236f1dbf249839c94" exitCode=0 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.016553 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ab930cd4-caad-4980-a491-8f6c5abca8bf","Type":"ContainerDied","Data":"9ccbb101b60a4c5d1f4f9801ccb65f5ad73e384c6a69193236f1dbf249839c94"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.016583 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ab930cd4-caad-4980-a491-8f6c5abca8bf","Type":"ContainerDied","Data":"cfdff48c902851a94ff0a491ee578227d5d7ef62d96b874a9d5baa4d09a7e74f"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.016598 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="cfdff48c902851a94ff0a491ee578227d5d7ef62d96b874a9d5baa4d09a7e74f" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.021731 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5kfkd"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.022149 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder62ae-account-delete-jd7lb" event={"ID":"80309f7c-d137-4116-a447-c9749c27c669","Type":"ContainerStarted","Data":"188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.022284 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder62ae-account-delete-jd7lb" podUID="80309f7c-d137-4116-a447-c9749c27c669" containerName="mariadb-account-delete" containerID="cri-o://188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63" gracePeriod=30 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.027192 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi0b57-account-delete-95cjw" event={"ID":"c76de706-34bc-4b37-8492-3573c19e91c2","Type":"ContainerStarted","Data":"abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.027590 4732 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novaapi0b57-account-delete-95cjw" secret="" err="secret \"galera-openstack-dockercfg-6rmlz\" not found" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.029638 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5kfkd"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.033066 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicand0a1-account-delete-5xtdn" event={"ID":"cb766f51-b132-4979-b32e-a2cfcb3edb50","Type":"ContainerStarted","Data":"2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.033173 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbicand0a1-account-delete-5xtdn" podUID="cb766f51-b132-4979-b32e-a2cfcb3edb50" containerName="mariadb-account-delete" containerID="cri-o://2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a" gracePeriod=30 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.042996 4732 generic.go:334] "Generic (PLEG): container finished" podID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerID="d77b0880ffd05c296384bd3f19b5b8b3ab8e7f54824859cab96f74dddf1fd9e2" exitCode=0 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.043055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7949456448-wncp2" event={"ID":"2960d902-25b0-4fb8-baa7-fe7f9d4f5811","Type":"ContainerDied","Data":"d77b0880ffd05c296384bd3f19b5b8b3ab8e7f54824859cab96f74dddf1fd9e2"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.045395 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d445dfc98-wk5w4" event={"ID":"3e37998e-491a-43b8-abda-4bdfea233217","Type":"ContainerDied","Data":"645c9cd4a1dd0b79ccd8c0bbbd6ad0325c5870e48176f2d49707cdd64e3a78e3"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.045417 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="645c9cd4a1dd0b79ccd8c0bbbd6ad0325c5870e48176f2d49707cdd64e3a78e3" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.047559 4732 generic.go:334] "Generic (PLEG): container finished" podID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerID="d76c61903df35f0f8176003951cca022e8081f2049617d8550fff70d06901f35" exitCode=0 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.047582 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2","Type":"ContainerDied","Data":"d76c61903df35f0f8176003951cca022e8081f2049617d8550fff70d06901f35"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.047632 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2","Type":"ContainerDied","Data":"b74aa03cba2028b41de341da82af3ddd941b4c035fe7204eeb523e579a423b20"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.047644 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b74aa03cba2028b41de341da82af3ddd941b4c035fe7204eeb523e579a423b20" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.051915 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell00878-account-delete-pjh75"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.053025 4732 generic.go:334] "Generic (PLEG): container finished" podID="56a05e0a-f30f-4b7c-b939-eba8d0094d48" containerID="d6f0336dc5707aabe42a9ded989244034dbfa84b62509eb392ec6856fd475828" exitCode=1 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.053338 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1a8d0-account-delete-tcwnj" event={"ID":"56a05e0a-f30f-4b7c-b939-eba8d0094d48","Type":"ContainerDied","Data":"d6f0336dc5707aabe42a9ded989244034dbfa84b62509eb392ec6856fd475828"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.056232 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb179b69-8c25-49b1-88b5-6c17953ffbcd","Type":"ContainerDied","Data":"552f92891e8da3df71277c717d79fbdb0d0cb1e1b3f98cc5f356bc509ce33025"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.056262 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552f92891e8da3df71277c717d79fbdb0d0cb1e1b3f98cc5f356bc509ce33025" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.059340 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-0878-account-create-hdbsh"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.067124 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-0878-account-create-hdbsh"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.067977 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder62ae-account-delete-jd7lb" podStartSLOduration=7.067955234 podStartE2EDuration="7.067955234s" podCreationTimestamp="2025-10-10 07:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:13:30.037965677 +0000 UTC m=+1337.107556918" watchObservedRunningTime="2025-10-10 07:13:30.067955234 +0000 UTC m=+1337.137546475" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.068011 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"13bb7b78-cc62-4d3b-a33a-9af77ee9e141","Type":"ContainerDied","Data":"f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.067995 4732 generic.go:334] "Generic (PLEG): container finished" podID="13bb7b78-cc62-4d3b-a33a-9af77ee9e141" containerID="f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c" exitCode=0 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.076968 4732 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbicand0a1-account-delete-5xtdn" podStartSLOduration=6.076945759 podStartE2EDuration="6.076945759s" podCreationTimestamp="2025-10-10 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:13:30.050781176 +0000 UTC m=+1337.120372437" watchObservedRunningTime="2025-10-10 07:13:30.076945759 +0000 UTC m=+1337.146537000" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.084038 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi0b57-account-delete-95cjw" podStartSLOduration=6.084017902 podStartE2EDuration="6.084017902s" podCreationTimestamp="2025-10-10 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:13:30.060911082 +0000 UTC m=+1337.130502333" watchObservedRunningTime="2025-10-10 07:13:30.084017902 +0000 UTC m=+1337.153609143" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.095919 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrone3b5-account-delete-dtqm9" event={"ID":"5c751e0c-75c7-4aaf-bf32-55e6d022d802","Type":"ContainerStarted","Data":"cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.095979 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutrone3b5-account-delete-dtqm9" podUID="5c751e0c-75c7-4aaf-bf32-55e6d022d802" containerName="mariadb-account-delete" containerID="cri-o://cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508" gracePeriod=30 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.101594 4732 generic.go:334] "Generic (PLEG): container finished" podID="8f34ab2c-f804-4f24-a447-165d5afb984f" 
containerID="28679c08d1706b7a047ce63e0dbc74864b4cc97372d8f26b15d33225fb8a8912" exitCode=0 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.101644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f34ab2c-f804-4f24-a447-165d5afb984f","Type":"ContainerDied","Data":"28679c08d1706b7a047ce63e0dbc74864b4cc97372d8f26b15d33225fb8a8912"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.104007 4732 generic.go:334] "Generic (PLEG): container finished" podID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerID="f366ccf0fd7eff9163283eb01f40b778944fbee5750e2fdcbc35a6bd70d5f9a8" exitCode=0 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.104042 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a570b39e-7329-4bba-bfe0-cf5f7aa2269e","Type":"ContainerDied","Data":"f366ccf0fd7eff9163283eb01f40b778944fbee5750e2fdcbc35a6bd70d5f9a8"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.104058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a570b39e-7329-4bba-bfe0-cf5f7aa2269e","Type":"ContainerDied","Data":"b8c9aa31e222078d4192908fa0e5d027440473db942e4d92c412271b5d5647e4"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.104068 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8c9aa31e222078d4192908fa0e5d027440473db942e4d92c412271b5d5647e4" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.119207 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00878-account-delete-pjh75" event={"ID":"1f7ba305-07fd-408c-865f-463e3738e6cb","Type":"ContainerStarted","Data":"bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.119701 4732 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell00878-account-delete-pjh75" secret="" err="secret \"galera-openstack-dockercfg-6rmlz\" not found" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.137815 4732 generic.go:334] "Generic (PLEG): container finished" podID="56077f87-ea67-4080-b328-7186a7d0bf35" containerID="521cf68574ce4f3c728de951f4d7a2e5c5c7da7a4c60aac8f61aad22cdb5008d" exitCode=0 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.137888 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56077f87-ea67-4080-b328-7186a7d0bf35","Type":"ContainerDied","Data":"521cf68574ce4f3c728de951f4d7a2e5c5c7da7a4c60aac8f61aad22cdb5008d"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.142957 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutrone3b5-account-delete-dtqm9" podStartSLOduration=6.142936348 podStartE2EDuration="6.142936348s" podCreationTimestamp="2025-10-10 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:13:30.116196369 +0000 UTC m=+1337.185787610" watchObservedRunningTime="2025-10-10 07:13:30.142936348 +0000 UTC m=+1337.212527589" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.150553 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell00878-account-delete-pjh75" podStartSLOduration=6.150538045 podStartE2EDuration="6.150538045s" podCreationTimestamp="2025-10-10 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:13:30.138062465 +0000 UTC m=+1337.207653706" watchObservedRunningTime="2025-10-10 07:13:30.150538045 +0000 UTC m=+1337.220129286" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.153832 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancedb70-account-delete-6srhp" 
event={"ID":"03efe727-1f84-49e0-b6cb-a7189a02ba76","Type":"ContainerStarted","Data":"d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57"} Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.153869 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glancedb70-account-delete-6srhp" podUID="03efe727-1f84-49e0-b6cb-a7189a02ba76" containerName="mariadb-account-delete" containerID="cri-o://d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57" gracePeriod=30 Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.174954 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glancedb70-account-delete-6srhp" podStartSLOduration=6.17494154 podStartE2EDuration="6.17494154s" podCreationTimestamp="2025-10-10 07:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 07:13:30.169533223 +0000 UTC m=+1337.239124464" watchObservedRunningTime="2025-10-10 07:13:30.17494154 +0000 UTC m=+1337.244532781" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.478372 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.488075 4732 scope.go:117] "RemoveContainer" containerID="335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d" Oct 10 07:13:30 crc kubenswrapper[4732]: E1010 07:13:30.488807 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d\": container with ID starting with 335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d not found: ID does not exist" containerID="335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.488845 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d"} err="failed to get container status \"335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d\": rpc error: code = NotFound desc = could not find container \"335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d\": container with ID starting with 335ad6b06d8f051b716f55bc026720ece2be3d4372bec23ab53f7b378940ec5d not found: ID does not exist" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.488874 4732 scope.go:117] "RemoveContainer" containerID="698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686" Oct 10 07:13:30 crc kubenswrapper[4732]: E1010 07:13:30.489151 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686\": container with ID starting with 698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686 not found: ID does not exist" containerID="698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.489175 
4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686"} err="failed to get container status \"698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686\": rpc error: code = NotFound desc = could not find container \"698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686\": container with ID starting with 698040aef33fb3136b3d55721010525a4b75b250e35d5f0abe5881570364c686 not found: ID does not exist" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.489189 4732 scope.go:117] "RemoveContainer" containerID="ef9057491152a5767b996f9aa867ebd6dd43e2419bbb3e1338a179c119d1dd11" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.492510 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.520440 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.530432 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.533616 4732 scope.go:117] "RemoveContainer" containerID="a99618b8dab7c28ba86268863ff2d9ff67fee28ff3451930b3093109a33ec4fa" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.546699 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-memcached-tls-certs\") pod \"ab930cd4-caad-4980-a491-8f6c5abca8bf\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555505 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-config-data\") pod \"3e37998e-491a-43b8-abda-4bdfea233217\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555528 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555573 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-combined-ca-bundle\") pod \"ab930cd4-caad-4980-a491-8f6c5abca8bf\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555607 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-internal-tls-certs\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555629 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3e37998e-491a-43b8-abda-4bdfea233217-logs\") pod \"3e37998e-491a-43b8-abda-4bdfea233217\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555647 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb179b69-8c25-49b1-88b5-6c17953ffbcd-etc-machine-id\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555667 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-scripts\") pod \"3e37998e-491a-43b8-abda-4bdfea233217\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555748 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data-custom\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555775 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-scripts\") pod \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555799 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-kolla-config\") pod \"ab930cd4-caad-4980-a491-8f6c5abca8bf\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555821 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-config-data\") pod \"ab930cd4-caad-4980-a491-8f6c5abca8bf\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555844 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws4nq\" (UniqueName: \"kubernetes.io/projected/cb179b69-8c25-49b1-88b5-6c17953ffbcd-kube-api-access-ws4nq\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555865 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-logs\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555898 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m6mx\" (UniqueName: \"kubernetes.io/projected/ab930cd4-caad-4980-a491-8f6c5abca8bf-kube-api-access-7m6mx\") pod \"ab930cd4-caad-4980-a491-8f6c5abca8bf\" (UID: \"ab930cd4-caad-4980-a491-8f6c5abca8bf\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555919 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-httpd-run\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555942 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-public-tls-certs\") pod \"3e37998e-491a-43b8-abda-4bdfea233217\" (UID: 
\"3e37998e-491a-43b8-abda-4bdfea233217\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555965 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-config-data\") pod \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.555989 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556010 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb179b69-8c25-49b1-88b5-6c17953ffbcd-logs\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556033 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-config-data\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556055 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-combined-ca-bundle\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556076 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-httpd-run\") pod 
\"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556101 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-public-tls-certs\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556138 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-public-tls-certs\") pod \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556162 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-logs\") pod \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556191 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78lrr\" (UniqueName: \"kubernetes.io/projected/3e37998e-491a-43b8-abda-4bdfea233217-kube-api-access-78lrr\") pod \"3e37998e-491a-43b8-abda-4bdfea233217\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556216 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-combined-ca-bundle\") pod \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556237 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556260 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-internal-tls-certs\") pod \"3e37998e-491a-43b8-abda-4bdfea233217\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556282 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-scripts\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556307 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-combined-ca-bundle\") pod \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\" (UID: \"cb179b69-8c25-49b1-88b5-6c17953ffbcd\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556333 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-internal-tls-certs\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556361 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-scripts\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc 
kubenswrapper[4732]: I1010 07:13:30.556383 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c94gt\" (UniqueName: \"kubernetes.io/projected/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-kube-api-access-c94gt\") pod \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\" (UID: \"55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556407 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-986sk\" (UniqueName: \"kubernetes.io/projected/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-kube-api-access-986sk\") pod \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\" (UID: \"a570b39e-7329-4bba-bfe0-cf5f7aa2269e\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.556435 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-combined-ca-bundle\") pod \"3e37998e-491a-43b8-abda-4bdfea233217\" (UID: \"3e37998e-491a-43b8-abda-4bdfea233217\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.564435 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e37998e-491a-43b8-abda-4bdfea233217-logs" (OuterVolumeSpecName: "logs") pod "3e37998e-491a-43b8-abda-4bdfea233217" (UID: "3e37998e-491a-43b8-abda-4bdfea233217"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.564636 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.564801 4732 scope.go:117] "RemoveContainer" containerID="da049d5e742375a0e81025f9ed942f10ce21870d6ef985b08d35d482caee4ce8" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.565050 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.565969 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb179b69-8c25-49b1-88b5-6c17953ffbcd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.576428 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb179b69-8c25-49b1-88b5-6c17953ffbcd-logs" (OuterVolumeSpecName: "logs") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.576450 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.577395 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.580027 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-logs" (OuterVolumeSpecName: "logs") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.580111 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-config-data" (OuterVolumeSpecName: "config-data") pod "ab930cd4-caad-4980-a491-8f6c5abca8bf" (UID: "ab930cd4-caad-4980-a491-8f6c5abca8bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.581855 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "ab930cd4-caad-4980-a491-8f6c5abca8bf" (UID: "ab930cd4-caad-4980-a491-8f6c5abca8bf"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.582732 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-logs" (OuterVolumeSpecName: "logs") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.583315 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.584787 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-scripts" (OuterVolumeSpecName: "scripts") pod "3e37998e-491a-43b8-abda-4bdfea233217" (UID: "3e37998e-491a-43b8-abda-4bdfea233217"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.585008 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e37998e-491a-43b8-abda-4bdfea233217-kube-api-access-78lrr" (OuterVolumeSpecName: "kube-api-access-78lrr") pod "3e37998e-491a-43b8-abda-4bdfea233217" (UID: "3e37998e-491a-43b8-abda-4bdfea233217"). InnerVolumeSpecName "kube-api-access-78lrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.585271 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb179b69-8c25-49b1-88b5-6c17953ffbcd-kube-api-access-ws4nq" (OuterVolumeSpecName: "kube-api-access-ws4nq") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "kube-api-access-ws4nq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.591417 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-scripts" (OuterVolumeSpecName: "scripts") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.614430 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.615409 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.616850 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab930cd4-caad-4980-a491-8f6c5abca8bf-kube-api-access-7m6mx" (OuterVolumeSpecName: "kube-api-access-7m6mx") pod "ab930cd4-caad-4980-a491-8f6c5abca8bf" (UID: "ab930cd4-caad-4980-a491-8f6c5abca8bf"). InnerVolumeSpecName "kube-api-access-7m6mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.617521 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.618129 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.624126 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell1a8d0-account-delete-tcwnj" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.627635 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.630956 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-kube-api-access-986sk" (OuterVolumeSpecName: "kube-api-access-986sk") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "kube-api-access-986sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.630954 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.632357 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-kube-api-access-c94gt" (OuterVolumeSpecName: "kube-api-access-c94gt") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "kube-api-access-c94gt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.638284 4732 scope.go:117] "RemoveContainer" containerID="3854a4792dd684bc5e205f322924b756bc947446fb50e6b4d2c49c8df807513b" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.638529 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-59899c8879-prgpj"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.642489 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-scripts" (OuterVolumeSpecName: "scripts") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.652295 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-scripts" (OuterVolumeSpecName: "scripts") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.660745 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-59899c8879-prgpj"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.663817 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.664526 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e37998e-491a-43b8-abda-4bdfea233217-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.664684 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb179b69-8c25-49b1-88b5-6c17953ffbcd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.665005 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.665168 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.665412 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.665573 4732 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-kolla-config\") on 
node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.665722 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab930cd4-caad-4980-a491-8f6c5abca8bf-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.665962 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws4nq\" (UniqueName: \"kubernetes.io/projected/cb179b69-8c25-49b1-88b5-6c17953ffbcd-kube-api-access-ws4nq\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.666122 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.666281 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m6mx\" (UniqueName: \"kubernetes.io/projected/ab930cd4-caad-4980-a491-8f6c5abca8bf-kube-api-access-7m6mx\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.666443 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.666607 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.666852 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb179b69-8c25-49b1-88b5-6c17953ffbcd-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.667073 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.667133 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.667457 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78lrr\" (UniqueName: \"kubernetes.io/projected/3e37998e-491a-43b8-abda-4bdfea233217-kube-api-access-78lrr\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.667520 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.667569 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.668035 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c94gt\" (UniqueName: \"kubernetes.io/projected/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-kube-api-access-c94gt\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.668099 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-986sk\" (UniqueName: \"kubernetes.io/projected/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-kube-api-access-986sk\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.671812 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.674829 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/kube-state-metrics-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.686610 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.715534 4732 scope.go:117] "RemoveContainer" containerID="0865053222fa0b5007b970a0a688d80dffb2106bd8ecb37be061b0d8aaf978cd" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.720596 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.736501 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement0e64-account-delete-6ncrf"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.739139 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab930cd4-caad-4980-a491-8f6c5abca8bf" (UID: "ab930cd4-caad-4980-a491-8f6c5abca8bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.752592 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement0e64-account-delete-6ncrf"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768763 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-public-tls-certs\") pod \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768838 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-combined-ca-bundle\") pod \"8f34ab2c-f804-4f24-a447-165d5afb984f\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768868 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-combined-ca-bundle\") pod \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768889 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-internal-tls-certs\") pod \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768906 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-config-data\") pod \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " 
Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-combined-ca-bundle\") pod \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768945 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-combined-ca-bundle\") pod \"56077f87-ea67-4080-b328-7186a7d0bf35\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.768972 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4jjq\" (UniqueName: \"kubernetes.io/projected/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-kube-api-access-x4jjq\") pod \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\" (UID: \"13bb7b78-cc62-4d3b-a33a-9af77ee9e141\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769002 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-internal-tls-certs\") pod \"56077f87-ea67-4080-b328-7186a7d0bf35\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769019 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f34ab2c-f804-4f24-a447-165d5afb984f-logs\") pod \"8f34ab2c-f804-4f24-a447-165d5afb984f\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769040 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-config-data\") pod \"8f34ab2c-f804-4f24-a447-165d5afb984f\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769324 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data-custom\") pod \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769345 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data\") pod \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769361 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-logs\") pod \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769399 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-config-data\") pod \"56077f87-ea67-4080-b328-7186a7d0bf35\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769443 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56077f87-ea67-4080-b328-7186a7d0bf35-logs\") pod \"56077f87-ea67-4080-b328-7186a7d0bf35\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769480 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9zc8m\" (UniqueName: \"kubernetes.io/projected/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-kube-api-access-9zc8m\") pod \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\" (UID: \"2960d902-25b0-4fb8-baa7-fe7f9d4f5811\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769502 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhxxx\" (UniqueName: \"kubernetes.io/projected/56a05e0a-f30f-4b7c-b939-eba8d0094d48-kube-api-access-fhxxx\") pod \"56a05e0a-f30f-4b7c-b939-eba8d0094d48\" (UID: \"56a05e0a-f30f-4b7c-b939-eba8d0094d48\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769541 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-499mx\" (UniqueName: \"kubernetes.io/projected/56077f87-ea67-4080-b328-7186a7d0bf35-kube-api-access-499mx\") pod \"56077f87-ea67-4080-b328-7186a7d0bf35\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769586 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxj5t\" (UniqueName: \"kubernetes.io/projected/8f34ab2c-f804-4f24-a447-165d5afb984f-kube-api-access-kxj5t\") pod \"8f34ab2c-f804-4f24-a447-165d5afb984f\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769604 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-nova-metadata-tls-certs\") pod \"8f34ab2c-f804-4f24-a447-165d5afb984f\" (UID: \"8f34ab2c-f804-4f24-a447-165d5afb984f\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769643 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-public-tls-certs\") pod \"56077f87-ea67-4080-b328-7186a7d0bf35\" (UID: \"56077f87-ea67-4080-b328-7186a7d0bf35\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.769996 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.771643 4732 scope.go:117] "RemoveContainer" containerID="90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.772725 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56077f87-ea67-4080-b328-7186a7d0bf35-logs" (OuterVolumeSpecName: "logs") pod "56077f87-ea67-4080-b328-7186a7d0bf35" (UID: "56077f87-ea67-4080-b328-7186a7d0bf35"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.779267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-logs" (OuterVolumeSpecName: "logs") pod "2960d902-25b0-4fb8-baa7-fe7f9d4f5811" (UID: "2960d902-25b0-4fb8-baa7-fe7f9d4f5811"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.781957 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f34ab2c-f804-4f24-a447-165d5afb984f-logs" (OuterVolumeSpecName: "logs") pod "8f34ab2c-f804-4f24-a447-165d5afb984f" (UID: "8f34ab2c-f804-4f24-a447-165d5afb984f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.796120 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-69886dc6f8-sfv6d"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.803438 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-69886dc6f8-sfv6d"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.808924 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a05e0a-f30f-4b7c-b939-eba8d0094d48-kube-api-access-fhxxx" (OuterVolumeSpecName: "kube-api-access-fhxxx") pod "56a05e0a-f30f-4b7c-b939-eba8d0094d48" (UID: "56a05e0a-f30f-4b7c-b939-eba8d0094d48"). InnerVolumeSpecName "kube-api-access-fhxxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.808988 4732 scope.go:117] "RemoveContainer" containerID="90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592" Oct 10 07:13:30 crc kubenswrapper[4732]: E1010 07:13:30.809498 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592\": container with ID starting with 90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592 not found: ID does not exist" containerID="90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.809535 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592"} err="failed to get container status \"90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592\": rpc error: code = NotFound desc = could not find container \"90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592\": container 
with ID starting with 90bc33249577a314e0f6ce33e6b280f5e7655d47d6eb5ebea3d7fc7b6159a592 not found: ID does not exist" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.809563 4732 scope.go:117] "RemoveContainer" containerID="1efcb6805df6c3ebeab4e1ae74758ec0a924658e669b9a8aed99ea90a2dcebc7" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.811742 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56077f87-ea67-4080-b328-7186a7d0bf35-kube-api-access-499mx" (OuterVolumeSpecName: "kube-api-access-499mx") pod "56077f87-ea67-4080-b328-7186a7d0bf35" (UID: "56077f87-ea67-4080-b328-7186a7d0bf35"). InnerVolumeSpecName "kube-api-access-499mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.811750 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.811930 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-kube-api-access-9zc8m" (OuterVolumeSpecName: "kube-api-access-9zc8m") pod "2960d902-25b0-4fb8-baa7-fe7f9d4f5811" (UID: "2960d902-25b0-4fb8-baa7-fe7f9d4f5811"). InnerVolumeSpecName "kube-api-access-9zc8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.812243 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2960d902-25b0-4fb8-baa7-fe7f9d4f5811" (UID: "2960d902-25b0-4fb8-baa7-fe7f9d4f5811"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.824193 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.824629 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-kube-api-access-x4jjq" (OuterVolumeSpecName: "kube-api-access-x4jjq") pod "13bb7b78-cc62-4d3b-a33a-9af77ee9e141" (UID: "13bb7b78-cc62-4d3b-a33a-9af77ee9e141"). InnerVolumeSpecName "kube-api-access-x4jjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.826947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f34ab2c-f804-4f24-a447-165d5afb984f-kube-api-access-kxj5t" (OuterVolumeSpecName: "kube-api-access-kxj5t") pod "8f34ab2c-f804-4f24-a447-165d5afb984f" (UID: "8f34ab2c-f804-4f24-a447-165d5afb984f"). InnerVolumeSpecName "kube-api-access-kxj5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.831322 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.831618 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.831665 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-76d49dbb9c-8g2mb"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.832325 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.841165 4732 scope.go:117] "RemoveContainer" containerID="61cf37eb60eab6d1c263f86dc1b196e204b1d7dd19e3a716700473b2f136d298" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.846956 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876525 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-galera-tls-certs\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876603 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs8xt\" (UniqueName: \"kubernetes.io/projected/dbaa5798-1d07-445a-a226-ad48054d3dbc-kube-api-access-xs8xt\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876651 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-secrets\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876736 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-operator-scripts\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876831 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-combined-ca-bundle\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876885 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-kolla-config\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876908 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-generated\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876932 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-default\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.876988 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"dbaa5798-1d07-445a-a226-ad48054d3dbc\" (UID: \"dbaa5798-1d07-445a-a226-ad48054d3dbc\") " Oct 10 
07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877293 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877312 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4jjq\" (UniqueName: \"kubernetes.io/projected/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-kube-api-access-x4jjq\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877321 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f34ab2c-f804-4f24-a447-165d5afb984f-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877330 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877338 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877347 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877355 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56077f87-ea67-4080-b328-7186a7d0bf35-logs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877364 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zc8m\" (UniqueName: 
\"kubernetes.io/projected/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-kube-api-access-9zc8m\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877375 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhxxx\" (UniqueName: \"kubernetes.io/projected/56a05e0a-f30f-4b7c-b939-eba8d0094d48-kube-api-access-fhxxx\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877385 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-499mx\" (UniqueName: \"kubernetes.io/projected/56077f87-ea67-4080-b328-7186a7d0bf35-kube-api-access-499mx\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877395 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxj5t\" (UniqueName: \"kubernetes.io/projected/8f34ab2c-f804-4f24-a447-165d5afb984f-kube-api-access-kxj5t\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.877404 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.879057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.879569 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.879730 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.879792 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.882471 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbaa5798-1d07-445a-a226-ad48054d3dbc-kube-api-access-xs8xt" (OuterVolumeSpecName: "kube-api-access-xs8xt") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "kube-api-access-xs8xt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.882531 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-76d49dbb9c-8g2mb"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.883742 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.895240 4732 scope.go:117] "RemoveContainer" containerID="4dd666a68eabadb0f0ffc4673eed0a478b682b215b386838970de85ec6a14574" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.895762 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-config-data" (OuterVolumeSpecName: "config-data") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.910329 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-secrets" (OuterVolumeSpecName: "secrets") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.919209 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.947343 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979142 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979169 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs8xt\" (UniqueName: \"kubernetes.io/projected/dbaa5798-1d07-445a-a226-ad48054d3dbc-kube-api-access-xs8xt\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979218 4732 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-secrets\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979228 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979237 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979246 4732 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc 
kubenswrapper[4732]: I1010 07:13:30.979256 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979264 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbaa5798-1d07-445a-a226-ad48054d3dbc-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.979791 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.980710 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13bb7b78-cc62-4d3b-a33a-9af77ee9e141" (UID: "13bb7b78-cc62-4d3b-a33a-9af77ee9e141"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.984697 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 10 07:13:30 crc kubenswrapper[4732]: I1010 07:13:30.989840 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f34ab2c-f804-4f24-a447-165d5afb984f" (UID: "8f34ab2c-f804-4f24-a447-165d5afb984f"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.008248 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" (UID: "55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.009719 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.010747 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2960d902-25b0-4fb8-baa7-fe7f9d4f5811" (UID: "2960d902-25b0-4fb8-baa7-fe7f9d4f5811"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.015838 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-config-data" (OuterVolumeSpecName: "config-data") pod "13bb7b78-cc62-4d3b-a33a-9af77ee9e141" (UID: "13bb7b78-cc62-4d3b-a33a-9af77ee9e141"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.021785 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-config-data" (OuterVolumeSpecName: "config-data") pod "8f34ab2c-f804-4f24-a447-165d5afb984f" (UID: "8f34ab2c-f804-4f24-a447-165d5afb984f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.028946 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.031454 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.041432 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data" (OuterVolumeSpecName: "config-data") pod "2960d902-25b0-4fb8-baa7-fe7f9d4f5811" (UID: "2960d902-25b0-4fb8-baa7-fe7f9d4f5811"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.061463 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2960d902-25b0-4fb8-baa7-fe7f9d4f5811" (UID: "2960d902-25b0-4fb8-baa7-fe7f9d4f5811"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.062708 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-config-data" (OuterVolumeSpecName: "config-data") pod "56077f87-ea67-4080-b328-7186a7d0bf35" (UID: "56077f87-ea67-4080-b328-7186a7d0bf35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.070978 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56077f87-ea67-4080-b328-7186a7d0bf35" (UID: "56077f87-ea67-4080-b328-7186a7d0bf35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.078441 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-config-data" (OuterVolumeSpecName: "config-data") pod "a570b39e-7329-4bba-bfe0-cf5f7aa2269e" (UID: "a570b39e-7329-4bba-bfe0-cf5f7aa2269e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081760 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081792 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081804 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081813 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081822 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081831 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13bb7b78-cc62-4d3b-a33a-9af77ee9e141-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081839 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081848 4732 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081857 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081865 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081873 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081881 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081889 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081897 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a570b39e-7329-4bba-bfe0-cf5f7aa2269e-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081905 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-public-tls-certs\") on 
node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.081913 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.089483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "ab930cd4-caad-4980-a491-8f6c5abca8bf" (UID: "ab930cd4-caad-4980-a491-8f6c5abca8bf"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.097476 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e37998e-491a-43b8-abda-4bdfea233217" (UID: "3e37998e-491a-43b8-abda-4bdfea233217"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.102406 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.119199 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.127992 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-config-data" (OuterVolumeSpecName: "config-data") pod "3e37998e-491a-43b8-abda-4bdfea233217" (UID: "3e37998e-491a-43b8-abda-4bdfea233217"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.128473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "56077f87-ea67-4080-b328-7186a7d0bf35" (UID: "56077f87-ea67-4080-b328-7186a7d0bf35"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.141061 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3e37998e-491a-43b8-abda-4bdfea233217" (UID: "3e37998e-491a-43b8-abda-4bdfea233217"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.141895 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2960d902-25b0-4fb8-baa7-fe7f9d4f5811" (UID: "2960d902-25b0-4fb8-baa7-fe7f9d4f5811"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.146459 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "dbaa5798-1d07-445a-a226-ad48054d3dbc" (UID: "dbaa5798-1d07-445a-a226-ad48054d3dbc"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.151016 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data" (OuterVolumeSpecName: "config-data") pod "cb179b69-8c25-49b1-88b5-6c17953ffbcd" (UID: "cb179b69-8c25-49b1-88b5-6c17953ffbcd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.164483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e37998e-491a-43b8-abda-4bdfea233217" (UID: "3e37998e-491a-43b8-abda-4bdfea233217"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.173935 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8f34ab2c-f804-4f24-a447-165d5afb984f" (UID: "8f34ab2c-f804-4f24-a447-165d5afb984f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.177957 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7949456448-wncp2" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.177976 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7949456448-wncp2" event={"ID":"2960d902-25b0-4fb8-baa7-fe7f9d4f5811","Type":"ContainerDied","Data":"82d576aaa00b31274d28aa29364ab9bbd5df67563209ffa142c0c03adbc3f704"} Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.178042 4732 scope.go:117] "RemoveContainer" containerID="d77b0880ffd05c296384bd3f19b5b8b3ab8e7f54824859cab96f74dddf1fd9e2" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183337 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183360 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2960d902-25b0-4fb8-baa7-fe7f9d4f5811-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183370 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183380 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183388 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183396 4732 
reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f34ab2c-f804-4f24-a447-165d5afb984f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183404 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb179b69-8c25-49b1-88b5-6c17953ffbcd-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183412 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183420 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183429 4732 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbaa5798-1d07-445a-a226-ad48054d3dbc-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183437 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e37998e-491a-43b8-abda-4bdfea233217-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.183446 4732 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab930cd4-caad-4980-a491-8f6c5abca8bf-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.191233 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1a8d0-account-delete-tcwnj" 
event={"ID":"56a05e0a-f30f-4b7c-b939-eba8d0094d48","Type":"ContainerDied","Data":"830bd467873fa8b3a42ce8ae4d7d2c0fc08f60e2cf63a026d600c0a94aabca4e"} Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.191335 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1a8d0-account-delete-tcwnj" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.192345 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "56077f87-ea67-4080-b328-7186a7d0bf35" (UID: "56077f87-ea67-4080-b328-7186a7d0bf35"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.196271 4732 generic.go:334] "Generic (PLEG): container finished" podID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerID="eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8" exitCode=0 Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.196327 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbaa5798-1d07-445a-a226-ad48054d3dbc","Type":"ContainerDied","Data":"eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8"} Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.196353 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbaa5798-1d07-445a-a226-ad48054d3dbc","Type":"ContainerDied","Data":"26766aa0ea9f17688805b58b326f94a2932565778f9056ca41c585c549f20e5a"} Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.196387 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.213059 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.213060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"13bb7b78-cc62-4d3b-a33a-9af77ee9e141","Type":"ContainerDied","Data":"40b44a9fe0f27c82f8b65e4c05eb68c80f0b87a6c122db473eb69ca84fcd1ead"} Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.215720 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8f34ab2c-f804-4f24-a447-165d5afb984f","Type":"ContainerDied","Data":"33e429c269fea0b0f07bca5a0059bd6f53ecb56ba9f6d7232aead904e4d6fd67"} Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.215863 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.221315 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d445dfc98-wk5w4" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.224425 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.224795 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"56077f87-ea67-4080-b328-7186a7d0bf35","Type":"ContainerDied","Data":"2938f4066e9be1324b7a1269147e5f3064581b90ac433508b7f277c852ecc5d3"} Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.225728 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.225792 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.225822 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.225891 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.230625 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell00878-account-delete-pjh75" podUID="1f7ba305-07fd-408c-865f-463e3738e6cb" containerName="mariadb-account-delete" containerID="cri-o://bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7" gracePeriod=30 Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.230829 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi0b57-account-delete-95cjw" podUID="c76de706-34bc-4b37-8492-3573c19e91c2" containerName="mariadb-account-delete" containerID="cri-o://abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57" gracePeriod=30 Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.250865 4732 scope.go:117] "RemoveContainer" containerID="37340908e0e97c3a729ef1965ebc6960980d4f8695b1143d55d26a38eea03ce7" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.284651 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/56077f87-ea67-4080-b328-7186a7d0bf35-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.284764 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.284843 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data podName:88a11668-5ab6-4b77-8bb7-ac60140f4bd4 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:39.284824827 +0000 UTC m=+1346.354416068 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data") pod "rabbitmq-cell1-server-0" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4") : configmap "rabbitmq-cell1-config-data" not found Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.301516 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7949456448-wncp2"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.309491 4732 scope.go:117] "RemoveContainer" containerID="d6f0336dc5707aabe42a9ded989244034dbfa84b62509eb392ec6856fd475828" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.311833 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7949456448-wncp2"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.353084 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1a8d0-account-delete-tcwnj"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.355348 4732 scope.go:117] "RemoveContainer" containerID="eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.364437 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1a8d0-account-delete-tcwnj"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.376190 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.386534 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.391471 4732 scope.go:117] "RemoveContainer" 
containerID="c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.397464 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.404823 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.427589 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.431238 4732 scope.go:117] "RemoveContainer" containerID="eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8" Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.434618 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8\": container with ID starting with eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8 not found: ID does not exist" containerID="eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.434723 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8"} err="failed to get container status \"eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8\": rpc error: code = NotFound desc = could not find container \"eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8\": container with ID starting with eaa4812cba54c55ecc20b810d7764f2480053373997e7a147258ac89b98d05a8 not found: ID does not exist" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.438325 4732 scope.go:117] "RemoveContainer" containerID="c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d" Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 
07:13:31.440850 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d\": container with ID starting with c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d not found: ID does not exist" containerID="c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.440915 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d"} err="failed to get container status \"c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d\": rpc error: code = NotFound desc = could not find container \"c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d\": container with ID starting with c6bf0956814b6e81a25943afc9784ec17ecedbf14df5795a081cf7f7e70beb6d not found: ID does not exist" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.440932 4732 scope.go:117] "RemoveContainer" containerID="f0e9cf711cbd1a4e52c04fe7ccc2b60f0d3a1255135c6660bc4ba015dc4cc21c" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.441064 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.441096 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.444472 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.451717 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.458312 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 
10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.465480 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.470573 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.478900 4732 scope.go:117] "RemoveContainer" containerID="28679c08d1706b7a047ce63e0dbc74864b4cc97372d8f26b15d33225fb8a8912" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.482383 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.488484 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.503368 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.503423 4732 scope.go:117] "RemoveContainer" containerID="aaf10e61456ff1882f86011615c89d1ec3d16648bdf71900a294e95ad885a047" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.509346 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.516911 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d445dfc98-wk5w4"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.522019 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5d445dfc98-wk5w4"] Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.577543 4732 scope.go:117] "RemoveContainer" containerID="521cf68574ce4f3c728de951f4d7a2e5c5c7da7a4c60aac8f61aad22cdb5008d" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.600555 4732 scope.go:117] "RemoveContainer" containerID="101a8d6951b18acd6a4c085a8be7084960161610e3440bd4359884be73a60369" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 
07:13:31.674279 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13bb7b78-cc62-4d3b-a33a-9af77ee9e141" path="/var/lib/kubelet/pods/13bb7b78-cc62-4d3b-a33a-9af77ee9e141/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.675335 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" path="/var/lib/kubelet/pods/27b69405-bc4b-4e39-be49-0a966bc649bb/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.676139 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" path="/var/lib/kubelet/pods/2960d902-25b0-4fb8-baa7-fe7f9d4f5811/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.677218 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e37998e-491a-43b8-abda-4bdfea233217" path="/var/lib/kubelet/pods/3e37998e-491a-43b8-abda-4bdfea233217/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.677674 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea62a47-1d15-41a2-a0d0-a0456a46183a" path="/var/lib/kubelet/pods/4ea62a47-1d15-41a2-a0d0-a0456a46183a/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.678175 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="521796db-f8be-41d3-a251-3ba1101d99bc" path="/var/lib/kubelet/pods/521796db-f8be-41d3-a251-3ba1101d99bc/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.679122 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" path="/var/lib/kubelet/pods/55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.680016 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" path="/var/lib/kubelet/pods/56077f87-ea67-4080-b328-7186a7d0bf35/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 
07:13:31.680490 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56a05e0a-f30f-4b7c-b939-eba8d0094d48" path="/var/lib/kubelet/pods/56a05e0a-f30f-4b7c-b939-eba8d0094d48/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.681534 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" path="/var/lib/kubelet/pods/63706a24-ebfd-45ae-96b0-49ab7bd13fdf/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.682089 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d536f03-ccb0-4f7f-9d6a-8e2250557ecb" path="/var/lib/kubelet/pods/6d536f03-ccb0-4f7f-9d6a-8e2250557ecb/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.682560 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710f9fa6-588e-4226-a65d-5220d0a1f315" path="/var/lib/kubelet/pods/710f9fa6-588e-4226-a65d-5220d0a1f315/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.683486 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" path="/var/lib/kubelet/pods/7daaf3e5-82f0-45f7-aa22-40be65433320/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.683978 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9aff96-840f-4c4c-8ae2-349dd76e614e" path="/var/lib/kubelet/pods/7f9aff96-840f-4c4c-8ae2-349dd76e614e/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.684474 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d80b654-a26e-46ea-84f4-264c3c883250" path="/var/lib/kubelet/pods/8d80b654-a26e-46ea-84f4-264c3c883250/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.685494 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" path="/var/lib/kubelet/pods/8f34ab2c-f804-4f24-a447-165d5afb984f/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 
07:13:31.686086 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" path="/var/lib/kubelet/pods/a570b39e-7329-4bba-bfe0-cf5f7aa2269e/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.687007 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab930cd4-caad-4980-a491-8f6c5abca8bf" path="/var/lib/kubelet/pods/ab930cd4-caad-4980-a491-8f6c5abca8bf/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.688034 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" path="/var/lib/kubelet/pods/cb179b69-8c25-49b1-88b5-6c17953ffbcd/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.688602 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0fa844d-f411-49a9-a52f-256760a71157" path="/var/lib/kubelet/pods/d0fa844d-f411-49a9-a52f-256760a71157/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.689172 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" path="/var/lib/kubelet/pods/d4e6ee2c-4b3b-48af-860d-f23aea3c4c85/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.690082 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbaa5798-1d07-445a-a226-ad48054d3dbc" path="/var/lib/kubelet/pods/dbaa5798-1d07-445a-a226-ad48054d3dbc/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.690821 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2fb6fd2-36fc-4a19-8462-f59d719b09d9" path="/var/lib/kubelet/pods/f2fb6fd2-36fc-4a19-8462-f59d719b09d9/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.691706 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" path="/var/lib/kubelet/pods/f5d96c35-c01e-4f12-ab12-7b6342789b2f/volumes" Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 
07:13:31.793211 4732 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.793300 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data podName:565f831c-0da8-4481-8461-8522e0cfa801 nodeName:}" failed. No retries permitted until 2025-10-10 07:13:39.793282034 +0000 UTC m=+1346.862873275 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data") pod "rabbitmq-server-0" (UID: "565f831c-0da8-4481-8461-8522e0cfa801") : configmap "rabbitmq-config-data" not found Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.856117 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43 is running failed: container process not found" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.856352 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43 is running failed: container process not found" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.856558 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43 is running 
failed: container process not found" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 10 07:13:31 crc kubenswrapper[4732]: E1010 07:13:31.856585 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="ovn-northd" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.922029 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.996142 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pglx\" (UniqueName: \"kubernetes.io/projected/b93e689a-691a-403b-970f-63547469bbfe-kube-api-access-4pglx\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:31 crc kubenswrapper[4732]: I1010 07:13:31.997589 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-public-tls-certs\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.014271 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-fernet-keys\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.014338 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-credential-keys\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.014366 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-config-data\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.014425 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-internal-tls-certs\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.014477 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-scripts\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.014494 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.009067 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93e689a-691a-403b-970f-63547469bbfe-kube-api-access-4pglx" (OuterVolumeSpecName: "kube-api-access-4pglx") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). 
InnerVolumeSpecName "kube-api-access-4pglx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.033458 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-scripts" (OuterVolumeSpecName: "scripts") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.042053 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.042717 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.097601 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.097847 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: E1010 07:13:32.113276 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle podName:b93e689a-691a-403b-970f-63547469bbfe nodeName:}" failed. No retries permitted until 2025-10-10 07:13:32.613246604 +0000 UTC m=+1339.682837845 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe") : error deleting /var/lib/kubelet/pods/b93e689a-691a-403b-970f-63547469bbfe/volume-subpaths: remove /var/lib/kubelet/pods/b93e689a-691a-403b-970f-63547469bbfe/volume-subpaths: no such file or directory Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.116192 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-config-data" (OuterVolumeSpecName: "config-data") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.117990 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.118007 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.118015 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pglx\" (UniqueName: \"kubernetes.io/projected/b93e689a-691a-403b-970f-63547469bbfe-kube-api-access-4pglx\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.118026 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.118033 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.118041 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.118049 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.236566 4732 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_99f1b967-cc4d-4092-87e9-64cbbc84be27/ovn-northd/0.log" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.236637 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.244678 4732 generic.go:334] "Generic (PLEG): container finished" podID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerID="84a5b3ebb026e19550cf8d398201da96d41bac755d18b33cc928544d7a6cf2c5" exitCode=0 Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.244744 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"88a11668-5ab6-4b77-8bb7-ac60140f4bd4","Type":"ContainerDied","Data":"84a5b3ebb026e19550cf8d398201da96d41bac755d18b33cc928544d7a6cf2c5"} Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.247441 4732 generic.go:334] "Generic (PLEG): container finished" podID="b93e689a-691a-403b-970f-63547469bbfe" containerID="e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1" exitCode=0 Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.247486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8c7d5b696-rkhkz" event={"ID":"b93e689a-691a-403b-970f-63547469bbfe","Type":"ContainerDied","Data":"e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1"} Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.247503 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8c7d5b696-rkhkz" event={"ID":"b93e689a-691a-403b-970f-63547469bbfe","Type":"ContainerDied","Data":"c125154cabcdfe5999b14b69e6aec32e2bc02f073f7c1b37d5c7e96e975bcddc"} Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.247519 4732 scope.go:117] "RemoveContainer" containerID="e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.247653 4732 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8c7d5b696-rkhkz" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.251632 4732 generic.go:334] "Generic (PLEG): container finished" podID="565f831c-0da8-4481-8461-8522e0cfa801" containerID="f7527cba13db589dd756b76500f8bbf94063e072e3428b0e50a4da46b0e63723" exitCode=0 Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.251674 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"565f831c-0da8-4481-8461-8522e0cfa801","Type":"ContainerDied","Data":"f7527cba13db589dd756b76500f8bbf94063e072e3428b0e50a4da46b0e63723"} Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.263013 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_99f1b967-cc4d-4092-87e9-64cbbc84be27/ovn-northd/0.log" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.263059 4732 generic.go:334] "Generic (PLEG): container finished" podID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" exitCode=139 Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.263117 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1b967-cc4d-4092-87e9-64cbbc84be27","Type":"ContainerDied","Data":"47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43"} Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.263144 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"99f1b967-cc4d-4092-87e9-64cbbc84be27","Type":"ContainerDied","Data":"63367abe9400d87b65d7cb4a168835c3c70c96a9997d7ffed71066f71459682c"} Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.263151 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.288445 4732 scope.go:117] "RemoveContainer" containerID="e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1" Oct 10 07:13:32 crc kubenswrapper[4732]: E1010 07:13:32.288996 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1\": container with ID starting with e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1 not found: ID does not exist" containerID="e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.289028 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1"} err="failed to get container status \"e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1\": rpc error: code = NotFound desc = could not find container \"e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1\": container with ID starting with e18ca7b3349ea105355ed9246a1c634aeb0a535e3d6a75c9256d2e11405998f1 not found: ID does not exist" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.289054 4732 scope.go:117] "RemoveContainer" containerID="1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.306245 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.319656 4732 scope.go:117] "RemoveContainer" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.320237 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-config\") pod \"99f1b967-cc4d-4092-87e9-64cbbc84be27\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.320337 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-combined-ca-bundle\") pod \"99f1b967-cc4d-4092-87e9-64cbbc84be27\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.320413 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-rundir\") pod \"99f1b967-cc4d-4092-87e9-64cbbc84be27\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.320474 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-metrics-certs-tls-certs\") pod \"99f1b967-cc4d-4092-87e9-64cbbc84be27\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.320545 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-northd-tls-certs\") pod \"99f1b967-cc4d-4092-87e9-64cbbc84be27\" (UID: 
\"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.320594 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-scripts\") pod \"99f1b967-cc4d-4092-87e9-64cbbc84be27\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.320652 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6hnt\" (UniqueName: \"kubernetes.io/projected/99f1b967-cc4d-4092-87e9-64cbbc84be27-kube-api-access-c6hnt\") pod \"99f1b967-cc4d-4092-87e9-64cbbc84be27\" (UID: \"99f1b967-cc4d-4092-87e9-64cbbc84be27\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.323743 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "99f1b967-cc4d-4092-87e9-64cbbc84be27" (UID: "99f1b967-cc4d-4092-87e9-64cbbc84be27"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.323993 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-scripts" (OuterVolumeSpecName: "scripts") pod "99f1b967-cc4d-4092-87e9-64cbbc84be27" (UID: "99f1b967-cc4d-4092-87e9-64cbbc84be27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.324139 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-config" (OuterVolumeSpecName: "config") pod "99f1b967-cc4d-4092-87e9-64cbbc84be27" (UID: "99f1b967-cc4d-4092-87e9-64cbbc84be27"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.324417 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f1b967-cc4d-4092-87e9-64cbbc84be27-kube-api-access-c6hnt" (OuterVolumeSpecName: "kube-api-access-c6hnt") pod "99f1b967-cc4d-4092-87e9-64cbbc84be27" (UID: "99f1b967-cc4d-4092-87e9-64cbbc84be27"). InnerVolumeSpecName "kube-api-access-c6hnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.352375 4732 scope.go:117] "RemoveContainer" containerID="1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a" Oct 10 07:13:32 crc kubenswrapper[4732]: E1010 07:13:32.352982 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a\": container with ID starting with 1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a not found: ID does not exist" containerID="1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.353018 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a"} err="failed to get container status \"1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a\": rpc error: code = NotFound desc = could not find container \"1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a\": container with ID starting with 1579364003b92c9bdec34e10b8427af125286f39dac8ba10a722fffea0ec366a not found: ID does not exist" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.353044 4732 scope.go:117] "RemoveContainer" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" Oct 10 07:13:32 crc kubenswrapper[4732]: E1010 07:13:32.353311 
4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43\": container with ID starting with 47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43 not found: ID does not exist" containerID="47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.353348 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43"} err="failed to get container status \"47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43\": rpc error: code = NotFound desc = could not find container \"47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43\": container with ID starting with 47db53a5421ceea9d4dff3d185b5c9253f8f5094e70b3d84e7d374fe2935ed43 not found: ID does not exist" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.366339 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99f1b967-cc4d-4092-87e9-64cbbc84be27" (UID: "99f1b967-cc4d-4092-87e9-64cbbc84be27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.406176 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "99f1b967-cc4d-4092-87e9-64cbbc84be27" (UID: "99f1b967-cc4d-4092-87e9-64cbbc84be27"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.418987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "99f1b967-cc4d-4092-87e9-64cbbc84be27" (UID: "99f1b967-cc4d-4092-87e9-64cbbc84be27"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422373 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-plugins\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422453 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-confd\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422480 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-tls\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422512 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422540 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-erlang-cookie-secret\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422577 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-server-conf\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422627 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-erlang-cookie\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422671 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422720 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-plugins-conf\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vthqm\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-kube-api-access-vthqm\") pod 
\"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422770 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-pod-info\") pod \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\" (UID: \"88a11668-5ab6-4b77-8bb7-ac60140f4bd4\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.422777 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423040 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423053 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423064 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423077 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/99f1b967-cc4d-4092-87e9-64cbbc84be27-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc 
kubenswrapper[4732]: I1010 07:13:32.423215 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423232 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6hnt\" (UniqueName: \"kubernetes.io/projected/99f1b967-cc4d-4092-87e9-64cbbc84be27-kube-api-access-c6hnt\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423243 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99f1b967-cc4d-4092-87e9-64cbbc84be27-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423254 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423111 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.423406 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.426230 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-kube-api-access-vthqm" (OuterVolumeSpecName: "kube-api-access-vthqm") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "kube-api-access-vthqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.426928 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.428434 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.429927 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.435260 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-pod-info" (OuterVolumeSpecName: "pod-info") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.446869 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data" (OuterVolumeSpecName: "config-data") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.482930 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-server-conf" (OuterVolumeSpecName: "server-conf") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.484246 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.508464 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "88a11668-5ab6-4b77-8bb7-ac60140f4bd4" (UID: "88a11668-5ab6-4b77-8bb7-ac60140f4bd4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.524644 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/565f831c-0da8-4481-8461-8522e0cfa801-pod-info\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.524765 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl8rp\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-kube-api-access-xl8rp\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.524818 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/565f831c-0da8-4481-8461-8522e0cfa801-erlang-cookie-secret\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.524882 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-server-conf\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.524906 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.524962 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-plugins\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.524987 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-plugins-conf\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525033 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-tls\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525071 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-confd\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525106 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-erlang-cookie\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525131 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"565f831c-0da8-4481-8461-8522e0cfa801\" (UID: \"565f831c-0da8-4481-8461-8522e0cfa801\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525557 
4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525573 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525582 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525590 4732 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525601 4732 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-server-conf\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525609 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525631 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525640 4732 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525648 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vthqm\" (UniqueName: \"kubernetes.io/projected/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-kube-api-access-vthqm\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525656 4732 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/88a11668-5ab6-4b77-8bb7-ac60140f4bd4-pod-info\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.525876 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.526397 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.527799 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565f831c-0da8-4481-8461-8522e0cfa801-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.527891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-kube-api-access-xl8rp" (OuterVolumeSpecName: "kube-api-access-xl8rp") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "kube-api-access-xl8rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.528181 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.531281 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.533862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/565f831c-0da8-4481-8461-8522e0cfa801-pod-info" (OuterVolumeSpecName: "pod-info") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.534520 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.548180 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data" (OuterVolumeSpecName: "config-data") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.555492 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.563103 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-server-conf" (OuterVolumeSpecName: "server-conf") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.599964 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.604264 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.610672 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "565f831c-0da8-4481-8461-8522e0cfa801" (UID: "565f831c-0da8-4481-8461-8522e0cfa801"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627164 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle\") pod \"b93e689a-691a-403b-970f-63547469bbfe\" (UID: \"b93e689a-691a-403b-970f-63547469bbfe\") " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627534 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl8rp\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-kube-api-access-xl8rp\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627554 4732 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/565f831c-0da8-4481-8461-8522e0cfa801-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627564 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: 
I1010 07:13:32.627574 4732 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-server-conf\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627583 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627591 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627612 4732 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/565f831c-0da8-4481-8461-8522e0cfa801-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627622 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627660 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627676 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/565f831c-0da8-4481-8461-8522e0cfa801-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627720 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.627730 4732 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/565f831c-0da8-4481-8461-8522e0cfa801-pod-info\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.629669 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b93e689a-691a-403b-970f-63547469bbfe" (UID: "b93e689a-691a-403b-970f-63547469bbfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.643109 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.728572 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b93e689a-691a-403b-970f-63547469bbfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.729488 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:32 crc kubenswrapper[4732]: E1010 07:13:32.850204 4732 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 10 07:13:32 crc kubenswrapper[4732]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-10T07:13:25Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 10 07:13:32 crc kubenswrapper[4732]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" 
Oct 10 07:13:32 crc kubenswrapper[4732]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-lzkzk" message=< Oct 10 07:13:32 crc kubenswrapper[4732]: Exiting ovn-controller (1) [FAILED] Oct 10 07:13:32 crc kubenswrapper[4732]: Killing ovn-controller (1) [ OK ] Oct 10 07:13:32 crc kubenswrapper[4732]: Killing ovn-controller (1) with SIGKILL [ OK ] Oct 10 07:13:32 crc kubenswrapper[4732]: 2025-10-10T07:13:25Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 10 07:13:32 crc kubenswrapper[4732]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Oct 10 07:13:32 crc kubenswrapper[4732]: > Oct 10 07:13:32 crc kubenswrapper[4732]: E1010 07:13:32.850256 4732 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 10 07:13:32 crc kubenswrapper[4732]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-10-10T07:13:25Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Oct 10 07:13:32 crc kubenswrapper[4732]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Oct 10 07:13:32 crc kubenswrapper[4732]: > pod="openstack/ovn-controller-lzkzk" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" containerID="cri-o://5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71" Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.850309 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-lzkzk" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" containerID="cri-o://5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71" gracePeriod=22 Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.962051 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8c7d5b696-rkhkz"] Oct 10 07:13:32 crc kubenswrapper[4732]: I1010 07:13:32.967518 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-8c7d5b696-rkhkz"] Oct 10 07:13:33 crc kubenswrapper[4732]: E1010 07:13:33.004918 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:13:33 crc kubenswrapper[4732]: E1010 07:13:33.016593 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:13:33 crc kubenswrapper[4732]: E1010 07:13:33.018116 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 07:13:33 crc kubenswrapper[4732]: E1010 07:13:33.018153 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="53c7a322-6bdd-4613-9a25-39391becbb81" containerName="nova-scheduler-scheduler" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.237761 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lzkzk_b8c3140a-2ab2-44f7-9ddd-73de883c4b65/ovn-controller/0.log" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.237840 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lzkzk" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.298372 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lzkzk_b8c3140a-2ab2-44f7-9ddd-73de883c4b65/ovn-controller/0.log" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.298428 4732 generic.go:334] "Generic (PLEG): container finished" podID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerID="5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71" exitCode=137 Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.298481 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzkzk" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.298511 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk" event={"ID":"b8c3140a-2ab2-44f7-9ddd-73de883c4b65","Type":"ContainerDied","Data":"5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71"} Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.298542 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzkzk" event={"ID":"b8c3140a-2ab2-44f7-9ddd-73de883c4b65","Type":"ContainerDied","Data":"23d65ed8a4a4ec316d44a17b7d44876f3ef058d60c18fdcd3723fd327bf7237f"} Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.298559 4732 scope.go:117] "RemoveContainer" containerID="5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.300669 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"88a11668-5ab6-4b77-8bb7-ac60140f4bd4","Type":"ContainerDied","Data":"4627ef1862c57ebe1d5e5358c0926948ebf0f4f721e9cb1259ad69071f5ddd3e"} Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.300722 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.304362 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"565f831c-0da8-4481-8461-8522e0cfa801","Type":"ContainerDied","Data":"f8c832e67f10479d00bf02a9a6f6bf1974153e25af20ac26b9e5edf78b3a8e27"} Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.304424 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.337934 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-scripts\") pod \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338186 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run\") pod \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338273 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run-ovn\") pod \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338345 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-combined-ca-bundle\") pod \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338424 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-ovn-controller-tls-certs\") pod \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338486 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-log-ovn\") pod \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338566 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v72s7\" (UniqueName: \"kubernetes.io/projected/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-kube-api-access-v72s7\") pod \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\" (UID: \"b8c3140a-2ab2-44f7-9ddd-73de883c4b65\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338450 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run" (OuterVolumeSpecName: "var-run") pod "b8c3140a-2ab2-44f7-9ddd-73de883c4b65" (UID: "b8c3140a-2ab2-44f7-9ddd-73de883c4b65"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.338489 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b8c3140a-2ab2-44f7-9ddd-73de883c4b65" (UID: "b8c3140a-2ab2-44f7-9ddd-73de883c4b65"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.339261 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-scripts" (OuterVolumeSpecName: "scripts") pod "b8c3140a-2ab2-44f7-9ddd-73de883c4b65" (UID: "b8c3140a-2ab2-44f7-9ddd-73de883c4b65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.339363 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b8c3140a-2ab2-44f7-9ddd-73de883c4b65" (UID: "b8c3140a-2ab2-44f7-9ddd-73de883c4b65"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.343523 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-kube-api-access-v72s7" (OuterVolumeSpecName: "kube-api-access-v72s7") pod "b8c3140a-2ab2-44f7-9ddd-73de883c4b65" (UID: "b8c3140a-2ab2-44f7-9ddd-73de883c4b65"). InnerVolumeSpecName "kube-api-access-v72s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.359938 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8c3140a-2ab2-44f7-9ddd-73de883c4b65" (UID: "b8c3140a-2ab2-44f7-9ddd-73de883c4b65"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.402631 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "b8c3140a-2ab2-44f7-9ddd-73de883c4b65" (UID: "b8c3140a-2ab2-44f7-9ddd-73de883c4b65"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.439719 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.439950 4732 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.439958 4732 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.439968 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.439979 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.439987 4732 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.439996 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v72s7\" (UniqueName: \"kubernetes.io/projected/b8c3140a-2ab2-44f7-9ddd-73de883c4b65-kube-api-access-v72s7\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.451039 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.455161 4732 scope.go:117] "RemoveContainer" containerID="5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71" Oct 10 07:13:33 crc kubenswrapper[4732]: E1010 07:13:33.456592 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71\": container with ID starting with 5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71 not found: ID does not exist" containerID="5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.456630 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71"} err="failed to get container status \"5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71\": rpc error: code = NotFound desc = could not find container \"5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71\": container with ID starting with 5c9c4c423fcd67fd4746428afe115a4932f60efe5d6cd5f56a001789e318af71 not found: ID does not exist" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.456658 4732 scope.go:117] "RemoveContainer" containerID="84a5b3ebb026e19550cf8d398201da96d41bac755d18b33cc928544d7a6cf2c5" Oct 10 07:13:33 crc 
kubenswrapper[4732]: I1010 07:13:33.469895 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.481149 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.485575 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.486932 4732 scope.go:117] "RemoveContainer" containerID="3ddbabed55e78f709270c00c818e9ba3b1b86ff17c658889d1c920cecadb8ebc" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.555188 4732 scope.go:117] "RemoveContainer" containerID="f7527cba13db589dd756b76500f8bbf94063e072e3428b0e50a4da46b0e63723" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.627448 4732 scope.go:117] "RemoveContainer" containerID="22b06feca3a6572b5d530d56c80f67e3ad45b92fa5b1fd8735418a9965bcc5fe" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.631416 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lzkzk"] Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.639791 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lzkzk"] Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.673095 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565f831c-0da8-4481-8461-8522e0cfa801" path="/var/lib/kubelet/pods/565f831c-0da8-4481-8461-8522e0cfa801/volumes" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.674265 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" path="/var/lib/kubelet/pods/88a11668-5ab6-4b77-8bb7-ac60140f4bd4/volumes" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.675496 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" 
path="/var/lib/kubelet/pods/99f1b967-cc4d-4092-87e9-64cbbc84be27/volumes" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.676139 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" path="/var/lib/kubelet/pods/b8c3140a-2ab2-44f7-9ddd-73de883c4b65/volumes" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.676757 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93e689a-691a-403b-970f-63547469bbfe" path="/var/lib/kubelet/pods/b93e689a-691a-403b-970f-63547469bbfe/volumes" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.681337 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743315 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-ceilometer-tls-certs\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743363 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-log-httpd\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743385 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-sg-core-conf-yaml\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743427 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-run-httpd\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743459 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-scripts\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743481 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-config-data\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743531 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-combined-ca-bundle\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.743557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl4hr\" (UniqueName: \"kubernetes.io/projected/ed592ee3-6dab-41d4-8141-bb7c31b02f73-kube-api-access-rl4hr\") pod \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\" (UID: \"ed592ee3-6dab-41d4-8141-bb7c31b02f73\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.753321 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.757549 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed592ee3-6dab-41d4-8141-bb7c31b02f73-kube-api-access-rl4hr" (OuterVolumeSpecName: "kube-api-access-rl4hr") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "kube-api-access-rl4hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.758188 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.762005 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-scripts" (OuterVolumeSpecName: "scripts") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.774183 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.781318 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.799355 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.803022 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.825386 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-config-data" (OuterVolumeSpecName: "config-data") pod "ed592ee3-6dab-41d4-8141-bb7c31b02f73" (UID: "ed592ee3-6dab-41d4-8141-bb7c31b02f73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.845635 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-combined-ca-bundle\") pod \"53c7a322-6bdd-4613-9a25-39391becbb81\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.845747 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmvd8\" (UniqueName: \"kubernetes.io/projected/53c7a322-6bdd-4613-9a25-39391becbb81-kube-api-access-qmvd8\") pod \"53c7a322-6bdd-4613-9a25-39391becbb81\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846005 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-config-data\") pod \"53c7a322-6bdd-4613-9a25-39391becbb81\" (UID: \"53c7a322-6bdd-4613-9a25-39391becbb81\") " Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846325 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846342 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846351 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846360 4732 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ed592ee3-6dab-41d4-8141-bb7c31b02f73-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846369 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846379 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846388 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed592ee3-6dab-41d4-8141-bb7c31b02f73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.846396 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl4hr\" (UniqueName: \"kubernetes.io/projected/ed592ee3-6dab-41d4-8141-bb7c31b02f73-kube-api-access-rl4hr\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.849932 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c7a322-6bdd-4613-9a25-39391becbb81-kube-api-access-qmvd8" (OuterVolumeSpecName: "kube-api-access-qmvd8") pod "53c7a322-6bdd-4613-9a25-39391becbb81" (UID: "53c7a322-6bdd-4613-9a25-39391becbb81"). InnerVolumeSpecName "kube-api-access-qmvd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.865172 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-config-data" (OuterVolumeSpecName: "config-data") pod "53c7a322-6bdd-4613-9a25-39391becbb81" (UID: "53c7a322-6bdd-4613-9a25-39391becbb81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.866156 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53c7a322-6bdd-4613-9a25-39391becbb81" (UID: "53c7a322-6bdd-4613-9a25-39391becbb81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.947994 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.948029 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53c7a322-6bdd-4613-9a25-39391becbb81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:33 crc kubenswrapper[4732]: I1010 07:13:33.948041 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmvd8\" (UniqueName: \"kubernetes.io/projected/53c7a322-6bdd-4613-9a25-39391becbb81-kube-api-access-qmvd8\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.320563 4732 generic.go:334] "Generic (PLEG): container finished" podID="53c7a322-6bdd-4613-9a25-39391becbb81" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" 
exitCode=0 Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.320618 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53c7a322-6bdd-4613-9a25-39391becbb81","Type":"ContainerDied","Data":"33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711"} Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.320660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"53c7a322-6bdd-4613-9a25-39391becbb81","Type":"ContainerDied","Data":"3fa6fe5f18d94b502e14fd85cd60727f160f0c09b94c48b0078b09ca87abf84e"} Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.320682 4732 scope.go:117] "RemoveContainer" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.320722 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.330751 4732 generic.go:334] "Generic (PLEG): container finished" podID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerID="d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812" exitCode=0 Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.330796 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerDied","Data":"d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812"} Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.330823 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ed592ee3-6dab-41d4-8141-bb7c31b02f73","Type":"ContainerDied","Data":"250e04f22958b91dd371dc791936e49052929cd02de81d757e6bcde53c1a602f"} Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.330930 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.363663 4732 scope.go:117] "RemoveContainer" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.368156 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711\": container with ID starting with 33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711 not found: ID does not exist" containerID="33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.368232 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711"} err="failed to get container status \"33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711\": rpc error: code = NotFound desc = could not find container \"33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711\": container with ID starting with 33a5fde174872bf1c9cadfb89907ea642caf2449b15d4c1d6ffaa6064efae711 not found: ID does not exist" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.368293 4732 scope.go:117] "RemoveContainer" containerID="f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.396816 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.403244 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.409311 4732 scope.go:117] "RemoveContainer" containerID="7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 
07:13:34.409599 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.415813 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.432867 4732 scope.go:117] "RemoveContainer" containerID="d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812" Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.454119 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.454958 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.455486 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.455531 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.456192 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.457659 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.458836 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.458864 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.470996 4732 scope.go:117] "RemoveContainer" 
containerID="03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.507666 4732 scope.go:117] "RemoveContainer" containerID="f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1" Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.508276 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1\": container with ID starting with f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1 not found: ID does not exist" containerID="f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.508356 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1"} err="failed to get container status \"f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1\": rpc error: code = NotFound desc = could not find container \"f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1\": container with ID starting with f8ca0942be97e2748919aa11222f2f72b00d5bf13eab8cc9e369f3619490dfd1 not found: ID does not exist" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.508392 4732 scope.go:117] "RemoveContainer" containerID="7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd" Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.508890 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd\": container with ID starting with 7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd not found: ID does not exist" containerID="7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd" Oct 10 07:13:34 crc 
kubenswrapper[4732]: I1010 07:13:34.508914 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd"} err="failed to get container status \"7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd\": rpc error: code = NotFound desc = could not find container \"7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd\": container with ID starting with 7aab4286266ff3694798b6622c3ba0e86642cad9a87dca5893437ef0db795afd not found: ID does not exist" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.508928 4732 scope.go:117] "RemoveContainer" containerID="d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812" Oct 10 07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.509319 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812\": container with ID starting with d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812 not found: ID does not exist" containerID="d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.509340 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812"} err="failed to get container status \"d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812\": rpc error: code = NotFound desc = could not find container \"d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812\": container with ID starting with d54b4043e46106f60c27c13517cb1eb673d8815f5a8634e127a9ff1f02a42812 not found: ID does not exist" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.509352 4732 scope.go:117] "RemoveContainer" containerID="03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12" Oct 10 
07:13:34 crc kubenswrapper[4732]: E1010 07:13:34.509638 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12\": container with ID starting with 03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12 not found: ID does not exist" containerID="03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12" Oct 10 07:13:34 crc kubenswrapper[4732]: I1010 07:13:34.509679 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12"} err="failed to get container status \"03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12\": rpc error: code = NotFound desc = could not find container \"03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12\": container with ID starting with 03d45489bbf3bf28e9643c2fda5c997cf37b3cf4eb0b6f6171b61ddaec8ffb12 not found: ID does not exist" Oct 10 07:13:35 crc kubenswrapper[4732]: I1010 07:13:35.209158 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 10 07:13:35 crc kubenswrapper[4732]: I1010 07:13:35.209273 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 10 07:13:35 crc kubenswrapper[4732]: I1010 07:13:35.682809 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="53c7a322-6bdd-4613-9a25-39391becbb81" path="/var/lib/kubelet/pods/53c7a322-6bdd-4613-9a25-39391becbb81/volumes" Oct 10 07:13:35 crc kubenswrapper[4732]: I1010 07:13:35.683926 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" path="/var/lib/kubelet/pods/ed592ee3-6dab-41d4-8141-bb7c31b02f73/volumes" Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.454819 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.457981 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.458002 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.459226 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.459400 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.466082 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.468249 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:39 crc kubenswrapper[4732]: E1010 07:13:39.468335 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.411442 4732 generic.go:334] "Generic (PLEG): container 
finished" podID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerID="2d672d5afe033cb8d13e4990cc214b86467df1b5080405aeaab34bdda430f497" exitCode=0 Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.411549 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785547cb47-x77nc" event={"ID":"eb94a64c-1a0c-4a61-bb69-e843b627cf35","Type":"ContainerDied","Data":"2d672d5afe033cb8d13e4990cc214b86467df1b5080405aeaab34bdda430f497"} Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.695182 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.810858 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-internal-tls-certs\") pod \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.810992 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-config\") pod \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.811119 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-ovndb-tls-certs\") pod \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.811145 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqnpx\" (UniqueName: \"kubernetes.io/projected/eb94a64c-1a0c-4a61-bb69-e843b627cf35-kube-api-access-lqnpx\") pod 
\"eb94a64c-1a0c-4a61-bb69-e843b627cf35\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.811223 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-combined-ca-bundle\") pod \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.811293 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-public-tls-certs\") pod \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.811321 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-httpd-config\") pod \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\" (UID: \"eb94a64c-1a0c-4a61-bb69-e843b627cf35\") " Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.827663 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eb94a64c-1a0c-4a61-bb69-e843b627cf35" (UID: "eb94a64c-1a0c-4a61-bb69-e843b627cf35"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.827812 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb94a64c-1a0c-4a61-bb69-e843b627cf35-kube-api-access-lqnpx" (OuterVolumeSpecName: "kube-api-access-lqnpx") pod "eb94a64c-1a0c-4a61-bb69-e843b627cf35" (UID: "eb94a64c-1a0c-4a61-bb69-e843b627cf35"). InnerVolumeSpecName "kube-api-access-lqnpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.855665 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-config" (OuterVolumeSpecName: "config") pod "eb94a64c-1a0c-4a61-bb69-e843b627cf35" (UID: "eb94a64c-1a0c-4a61-bb69-e843b627cf35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.864536 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "eb94a64c-1a0c-4a61-bb69-e843b627cf35" (UID: "eb94a64c-1a0c-4a61-bb69-e843b627cf35"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.867545 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb94a64c-1a0c-4a61-bb69-e843b627cf35" (UID: "eb94a64c-1a0c-4a61-bb69-e843b627cf35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.870646 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eb94a64c-1a0c-4a61-bb69-e843b627cf35" (UID: "eb94a64c-1a0c-4a61-bb69-e843b627cf35"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.890414 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "eb94a64c-1a0c-4a61-bb69-e843b627cf35" (UID: "eb94a64c-1a0c-4a61-bb69-e843b627cf35"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.913399 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.913436 4732 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.913453 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqnpx\" (UniqueName: \"kubernetes.io/projected/eb94a64c-1a0c-4a61-bb69-e843b627cf35-kube-api-access-lqnpx\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.913465 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.913477 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.913489 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:41 crc kubenswrapper[4732]: I1010 07:13:41.913501 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb94a64c-1a0c-4a61-bb69-e843b627cf35-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:42 crc kubenswrapper[4732]: I1010 07:13:42.428376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-785547cb47-x77nc" event={"ID":"eb94a64c-1a0c-4a61-bb69-e843b627cf35","Type":"ContainerDied","Data":"2126a92cc04a08ec64588b8ed283890c1fdd15a9133abe73dc8b540050740668"} Oct 10 07:13:42 crc kubenswrapper[4732]: I1010 07:13:42.428469 4732 scope.go:117] "RemoveContainer" containerID="7c102b472c047348ed7dff4aff9894c0cde366c8c678aa7233770836081af19e" Oct 10 07:13:42 crc kubenswrapper[4732]: I1010 07:13:42.429976 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-785547cb47-x77nc" Oct 10 07:13:42 crc kubenswrapper[4732]: I1010 07:13:42.459313 4732 scope.go:117] "RemoveContainer" containerID="2d672d5afe033cb8d13e4990cc214b86467df1b5080405aeaab34bdda430f497" Oct 10 07:13:42 crc kubenswrapper[4732]: I1010 07:13:42.482869 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-785547cb47-x77nc"] Oct 10 07:13:42 crc kubenswrapper[4732]: I1010 07:13:42.492805 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-785547cb47-x77nc"] Oct 10 07:13:43 crc kubenswrapper[4732]: I1010 07:13:43.678288 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" path="/var/lib/kubelet/pods/eb94a64c-1a0c-4a61-bb69-e843b627cf35/volumes" Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.454916 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.455196 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.455631 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.455682 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.455854 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.456924 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.457852 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:44 crc kubenswrapper[4732]: E1010 07:13:44.457921 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.455626 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.457225 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.457603 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.457643 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.459482 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.461474 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.462877 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:49 crc kubenswrapper[4732]: E1010 07:13:49.462955 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.454318 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.455147 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.456005 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.456058 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.457742 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.458978 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.464605 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 10 07:13:54 crc kubenswrapper[4732]: E1010 07:13:54.464650 4732 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-n9v88" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.065238 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9v88_3cddfa4e-ec03-4651-ae7b-87b2dc3ec030/ovs-vswitchd/0.log" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.066190 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231203 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-log\") pod \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231471 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-etc-ovs\") pod \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231322 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-log" (OuterVolumeSpecName: "var-log") pod "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" (UID: "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231497 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-run\") pod \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231516 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-run" (OuterVolumeSpecName: "var-run") pod "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" (UID: "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231552 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-scripts\") pod \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231572 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp9z6\" (UniqueName: \"kubernetes.io/projected/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-kube-api-access-sp9z6\") pod \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231574 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" (UID: "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231682 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-lib\") pod \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\" (UID: \"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.231863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-lib" (OuterVolumeSpecName: "var-lib") pod "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" (UID: "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.232201 4732 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-lib\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.232215 4732 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-log\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.232223 4732 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.232232 4732 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-var-run\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.232727 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-scripts" (OuterVolumeSpecName: "scripts") pod "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" (UID: "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.239932 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-kube-api-access-sp9z6" (OuterVolumeSpecName: "kube-api-access-sp9z6") pod "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" (UID: "3cddfa4e-ec03-4651-ae7b-87b2dc3ec030"). InnerVolumeSpecName "kube-api-access-sp9z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.333244 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.333281 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp9z6\" (UniqueName: \"kubernetes.io/projected/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030-kube-api-access-sp9z6\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.561536 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-n9v88_3cddfa4e-ec03-4651-ae7b-87b2dc3ec030/ovs-vswitchd/0.log" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.562807 4732 generic.go:334] "Generic (PLEG): container finished" podID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" exitCode=137 Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.562896 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-n9v88" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.562947 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9v88" event={"ID":"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030","Type":"ContainerDied","Data":"004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5"} Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.563257 4732 scope.go:117] "RemoveContainer" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.563135 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-n9v88" event={"ID":"3cddfa4e-ec03-4651-ae7b-87b2dc3ec030","Type":"ContainerDied","Data":"159a334e27f4cf5d4f8dc218da8d8aec72a3205eb7d266975ccfc7379aea7813"} Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.577668 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerID="6aa268d1067b6515b564fbd351c694b7f8bd27f2ca765a2e848302e1ec2da0ec" exitCode=137 Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.577792 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"6aa268d1067b6515b564fbd351c694b7f8bd27f2ca765a2e848302e1ec2da0ec"} Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.592656 4732 scope.go:117] "RemoveContainer" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.620197 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-n9v88"] Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.626914 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-n9v88"] Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.629333 4732 scope.go:117] 
"RemoveContainer" containerID="b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.654505 4732 scope.go:117] "RemoveContainer" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" Oct 10 07:13:55 crc kubenswrapper[4732]: E1010 07:13:55.654917 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5\": container with ID starting with 004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5 not found: ID does not exist" containerID="004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.654946 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5"} err="failed to get container status \"004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5\": rpc error: code = NotFound desc = could not find container \"004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5\": container with ID starting with 004b002852da359cb0635a2e0b2791e8ee16cbd8f9bc13e1cd6f238523d596d5 not found: ID does not exist" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.654966 4732 scope.go:117] "RemoveContainer" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" Oct 10 07:13:55 crc kubenswrapper[4732]: E1010 07:13:55.655461 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871\": container with ID starting with 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 not found: ID does not exist" containerID="0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871" Oct 10 07:13:55 crc 
kubenswrapper[4732]: I1010 07:13:55.655513 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871"} err="failed to get container status \"0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871\": rpc error: code = NotFound desc = could not find container \"0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871\": container with ID starting with 0ce5c4565ebea29e27e384709ead56ce8fe77e5fe9d669d7881c698174ec7871 not found: ID does not exist" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.655547 4732 scope.go:117] "RemoveContainer" containerID="b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631" Oct 10 07:13:55 crc kubenswrapper[4732]: E1010 07:13:55.655847 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631\": container with ID starting with b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631 not found: ID does not exist" containerID="b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.655876 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631"} err="failed to get container status \"b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631\": rpc error: code = NotFound desc = could not find container \"b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631\": container with ID starting with b37716e90099a542a7fe45e70ed9a03fcc51ab35c9cc4dbff7498a1404b1f631 not found: ID does not exist" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.673648 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" 
path="/var/lib/kubelet/pods/3cddfa4e-ec03-4651-ae7b-87b2dc3ec030/volumes" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.743021 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.840287 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") pod \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.840558 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.840597 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pvkk\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-kube-api-access-7pvkk\") pod \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.840633 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-cache\") pod \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.840654 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-lock\") pod \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\" (UID: \"6ec5be94-f09a-4728-8858-c18fbd9ca2c2\") " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 
07:13:55.841441 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-cache" (OuterVolumeSpecName: "cache") pod "6ec5be94-f09a-4728-8858-c18fbd9ca2c2" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.842021 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-lock" (OuterVolumeSpecName: "lock") pod "6ec5be94-f09a-4728-8858-c18fbd9ca2c2" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.845058 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-kube-api-access-7pvkk" (OuterVolumeSpecName: "kube-api-access-7pvkk") pod "6ec5be94-f09a-4728-8858-c18fbd9ca2c2" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2"). InnerVolumeSpecName "kube-api-access-7pvkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.845771 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6ec5be94-f09a-4728-8858-c18fbd9ca2c2" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.846235 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "swift") pod "6ec5be94-f09a-4728-8858-c18fbd9ca2c2" (UID: "6ec5be94-f09a-4728-8858-c18fbd9ca2c2"). 
InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.941949 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.942014 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.942029 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pvkk\" (UniqueName: \"kubernetes.io/projected/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-kube-api-access-7pvkk\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.942042 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-cache\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.942052 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ec5be94-f09a-4728-8858-c18fbd9ca2c2-lock\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:55 crc kubenswrapper[4732]: I1010 07:13:55.963930 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.042887 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.601375 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"6ec5be94-f09a-4728-8858-c18fbd9ca2c2","Type":"ContainerDied","Data":"5c30c7cb3461534a3149c9cdc991691349664994d1ba24bf0904fe56a2f748d8"} Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.601452 4732 scope.go:117] "RemoveContainer" containerID="6aa268d1067b6515b564fbd351c694b7f8bd27f2ca765a2e848302e1ec2da0ec" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.601494 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.630097 4732 scope.go:117] "RemoveContainer" containerID="7c44462876be789a8e5caeabb0625c49ae5413ec6663dae73e6b157a5e977d76" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.640642 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.647187 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.676500 4732 scope.go:117] "RemoveContainer" containerID="c875be356f22174bd7fe912809d07ce631dcb17edd6d1d6aabc340d517cc6551" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.700391 4732 scope.go:117] "RemoveContainer" containerID="7740e27fefba27d5e80df5ff662cfd5fc4b86c96b608fa32c24f8d2b25cee4a2" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.722854 4732 scope.go:117] "RemoveContainer" containerID="e2917273d26b808e5a8fc08c8152f588e5014472d4e7a647ebdbecedcba84fda" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.746110 4732 scope.go:117] "RemoveContainer" containerID="84783f363dda1053c7f032969b8a9b632ff711d6f0764371f0d881ee3ad20516" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.763353 4732 scope.go:117] "RemoveContainer" containerID="d4e49bf9ad485c0fe0bbb4a2dbc2f08f31e1f3158c54e7e7a0fa81f3f0046870" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.786405 4732 scope.go:117] 
"RemoveContainer" containerID="43a213f53856bce5c190f44e7458e042262da79e1784f2045dda4e75dc3471b6" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.807019 4732 scope.go:117] "RemoveContainer" containerID="bcadb525584dba5a9a1af302bfd2be19ff703a8c744b55dbf166f43746dfd5fa" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.823259 4732 scope.go:117] "RemoveContainer" containerID="ca80afd8ea95c25e8f07db4e28d154c1d53a72ce3f36789ae3ff9af29cf3a561" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.842253 4732 scope.go:117] "RemoveContainer" containerID="4577435a75c7166b15559759271a9948adb5a88482a2db26d6c48d48b9208d39" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.859901 4732 scope.go:117] "RemoveContainer" containerID="306e993d7c42ab68be7b6186fb51f97b059b5a7bcc1a130f1b6cecbe5bae570f" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.882145 4732 scope.go:117] "RemoveContainer" containerID="04a8993decfa9c602d76b910a4eb75f9a4b7db875ca6bfa209f16244327dbd1a" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.900337 4732 scope.go:117] "RemoveContainer" containerID="440271d54f3b94f368b668f0086f762ecb8f963317d10585e119ad50bf50d796" Oct 10 07:13:56 crc kubenswrapper[4732]: I1010 07:13:56.921539 4732 scope.go:117] "RemoveContainer" containerID="aea23c87d7a0e1648589bbcd40543c7d5e8ccf5a80b3a896677fc3b317ec2dda" Oct 10 07:13:57 crc kubenswrapper[4732]: I1010 07:13:57.674361 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" path="/var/lib/kubelet/pods/6ec5be94-f09a-4728-8858-c18fbd9ca2c2/volumes" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.509458 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicand0a1-account-delete-5xtdn" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.602144 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder62ae-account-delete-jd7lb" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.607738 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glancedb70-account-delete-6srhp" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.611374 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph99j\" (UniqueName: \"kubernetes.io/projected/cb766f51-b132-4979-b32e-a2cfcb3edb50-kube-api-access-ph99j\") pod \"cb766f51-b132-4979-b32e-a2cfcb3edb50\" (UID: \"cb766f51-b132-4979-b32e-a2cfcb3edb50\") " Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.618562 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb766f51-b132-4979-b32e-a2cfcb3edb50-kube-api-access-ph99j" (OuterVolumeSpecName: "kube-api-access-ph99j") pod "cb766f51-b132-4979-b32e-a2cfcb3edb50" (UID: "cb766f51-b132-4979-b32e-a2cfcb3edb50"). InnerVolumeSpecName "kube-api-access-ph99j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.641269 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutrone3b5-account-delete-dtqm9" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.642390 4732 generic.go:334] "Generic (PLEG): container finished" podID="cb766f51-b132-4979-b32e-a2cfcb3edb50" containerID="2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a" exitCode=137 Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.642444 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicand0a1-account-delete-5xtdn" event={"ID":"cb766f51-b132-4979-b32e-a2cfcb3edb50","Type":"ContainerDied","Data":"2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.642470 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicand0a1-account-delete-5xtdn" event={"ID":"cb766f51-b132-4979-b32e-a2cfcb3edb50","Type":"ContainerDied","Data":"dc6c82fe6ffb2f5e067bd499ba53169442c73bc301cb9f5eae59512ac1b0a5d1"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.642489 4732 scope.go:117] "RemoveContainer" containerID="2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.642605 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicand0a1-account-delete-5xtdn" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.645635 4732 generic.go:334] "Generic (PLEG): container finished" podID="03efe727-1f84-49e0-b6cb-a7189a02ba76" containerID="d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57" exitCode=137 Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.645740 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancedb70-account-delete-6srhp" event={"ID":"03efe727-1f84-49e0-b6cb-a7189a02ba76","Type":"ContainerDied","Data":"d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.645768 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glancedb70-account-delete-6srhp" event={"ID":"03efe727-1f84-49e0-b6cb-a7189a02ba76","Type":"ContainerDied","Data":"f9d07bdbcdc231e9a236d5fc89c80e8824bf1161f878aa426bf13bbebfb4a25f"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.645816 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glancedb70-account-delete-6srhp" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.648773 4732 generic.go:334] "Generic (PLEG): container finished" podID="80309f7c-d137-4116-a447-c9749c27c669" containerID="188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63" exitCode=137 Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.648811 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder62ae-account-delete-jd7lb" event={"ID":"80309f7c-d137-4116-a447-c9749c27c669","Type":"ContainerDied","Data":"188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.648825 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder62ae-account-delete-jd7lb" event={"ID":"80309f7c-d137-4116-a447-c9749c27c669","Type":"ContainerDied","Data":"427cac54f3478cf3880b320d223ca4b0983088f7b7f2630092e6d598ae35eca6"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.648855 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder62ae-account-delete-jd7lb" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.651849 4732 generic.go:334] "Generic (PLEG): container finished" podID="5c751e0c-75c7-4aaf-bf32-55e6d022d802" containerID="cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508" exitCode=137 Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.651872 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrone3b5-account-delete-dtqm9" event={"ID":"5c751e0c-75c7-4aaf-bf32-55e6d022d802","Type":"ContainerDied","Data":"cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.651894 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutrone3b5-account-delete-dtqm9" event={"ID":"5c751e0c-75c7-4aaf-bf32-55e6d022d802","Type":"ContainerDied","Data":"71fc0b1cd366e38052793e0669137aa6d7ece8fc603eecabfae54a5b05f0a322"} Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.651863 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutrone3b5-account-delete-dtqm9" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.664183 4732 scope.go:117] "RemoveContainer" containerID="2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a" Oct 10 07:14:00 crc kubenswrapper[4732]: E1010 07:14:00.667479 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a\": container with ID starting with 2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a not found: ID does not exist" containerID="2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.667516 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a"} err="failed to get container status \"2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a\": rpc error: code = NotFound desc = could not find container \"2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a\": container with ID starting with 2e9eef9d983a473a01ae40ceb3e7a5a54fc51a290a721c03cf773e2a91232f9a not found: ID does not exist" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.667536 4732 scope.go:117] "RemoveContainer" containerID="d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.680766 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicand0a1-account-delete-5xtdn"] Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.689796 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicand0a1-account-delete-5xtdn"] Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.692815 4732 scope.go:117] "RemoveContainer" 
containerID="d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57" Oct 10 07:14:00 crc kubenswrapper[4732]: E1010 07:14:00.693206 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57\": container with ID starting with d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57 not found: ID does not exist" containerID="d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.693238 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57"} err="failed to get container status \"d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57\": rpc error: code = NotFound desc = could not find container \"d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57\": container with ID starting with d17a3ba2a0fe9d5c439b17dda0bcce06dba794d70fe7371830981023916e9e57 not found: ID does not exist" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.693259 4732 scope.go:117] "RemoveContainer" containerID="188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.709372 4732 scope.go:117] "RemoveContainer" containerID="188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63" Oct 10 07:14:00 crc kubenswrapper[4732]: E1010 07:14:00.709708 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63\": container with ID starting with 188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63 not found: ID does not exist" containerID="188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63" Oct 10 07:14:00 crc 
kubenswrapper[4732]: I1010 07:14:00.709754 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63"} err="failed to get container status \"188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63\": rpc error: code = NotFound desc = could not find container \"188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63\": container with ID starting with 188dec5f6031832ef9110a92dabc0a3c25cff58d031eb07b66c5f3d6ccfecb63 not found: ID does not exist" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.709818 4732 scope.go:117] "RemoveContainer" containerID="cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.712255 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wzq9\" (UniqueName: \"kubernetes.io/projected/80309f7c-d137-4116-a447-c9749c27c669-kube-api-access-4wzq9\") pod \"80309f7c-d137-4116-a447-c9749c27c669\" (UID: \"80309f7c-d137-4116-a447-c9749c27c669\") " Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.712313 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxpgm\" (UniqueName: \"kubernetes.io/projected/5c751e0c-75c7-4aaf-bf32-55e6d022d802-kube-api-access-gxpgm\") pod \"5c751e0c-75c7-4aaf-bf32-55e6d022d802\" (UID: \"5c751e0c-75c7-4aaf-bf32-55e6d022d802\") " Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.712357 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tvrl\" (UniqueName: \"kubernetes.io/projected/03efe727-1f84-49e0-b6cb-a7189a02ba76-kube-api-access-7tvrl\") pod \"03efe727-1f84-49e0-b6cb-a7189a02ba76\" (UID: \"03efe727-1f84-49e0-b6cb-a7189a02ba76\") " Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.712570 4732 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ph99j\" (UniqueName: \"kubernetes.io/projected/cb766f51-b132-4979-b32e-a2cfcb3edb50-kube-api-access-ph99j\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.714575 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80309f7c-d137-4116-a447-c9749c27c669-kube-api-access-4wzq9" (OuterVolumeSpecName: "kube-api-access-4wzq9") pod "80309f7c-d137-4116-a447-c9749c27c669" (UID: "80309f7c-d137-4116-a447-c9749c27c669"). InnerVolumeSpecName "kube-api-access-4wzq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.714990 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03efe727-1f84-49e0-b6cb-a7189a02ba76-kube-api-access-7tvrl" (OuterVolumeSpecName: "kube-api-access-7tvrl") pod "03efe727-1f84-49e0-b6cb-a7189a02ba76" (UID: "03efe727-1f84-49e0-b6cb-a7189a02ba76"). InnerVolumeSpecName "kube-api-access-7tvrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.715262 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c751e0c-75c7-4aaf-bf32-55e6d022d802-kube-api-access-gxpgm" (OuterVolumeSpecName: "kube-api-access-gxpgm") pod "5c751e0c-75c7-4aaf-bf32-55e6d022d802" (UID: "5c751e0c-75c7-4aaf-bf32-55e6d022d802"). InnerVolumeSpecName "kube-api-access-gxpgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.724063 4732 scope.go:117] "RemoveContainer" containerID="cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508" Oct 10 07:14:00 crc kubenswrapper[4732]: E1010 07:14:00.724450 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508\": container with ID starting with cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508 not found: ID does not exist" containerID="cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.724511 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508"} err="failed to get container status \"cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508\": rpc error: code = NotFound desc = could not find container \"cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508\": container with ID starting with cf690e392d9ef7453c1a5fd5c13518a50783cdd804e5dc5eb6eb6e35b45d3508 not found: ID does not exist" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.814489 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wzq9\" (UniqueName: \"kubernetes.io/projected/80309f7c-d137-4116-a447-c9749c27c669-kube-api-access-4wzq9\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.814547 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxpgm\" (UniqueName: \"kubernetes.io/projected/5c751e0c-75c7-4aaf-bf32-55e6d022d802-kube-api-access-gxpgm\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.814567 4732 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7tvrl\" (UniqueName: \"kubernetes.io/projected/03efe727-1f84-49e0-b6cb-a7189a02ba76-kube-api-access-7tvrl\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.983629 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glancedb70-account-delete-6srhp"] Oct 10 07:14:00 crc kubenswrapper[4732]: I1010 07:14:00.989871 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glancedb70-account-delete-6srhp"] Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.004493 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder62ae-account-delete-jd7lb"] Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.016752 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder62ae-account-delete-jd7lb"] Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.030063 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutrone3b5-account-delete-dtqm9"] Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.040291 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutrone3b5-account-delete-dtqm9"] Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.604462 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi0b57-account-delete-95cjw" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.656311 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell00878-account-delete-pjh75" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.661536 4732 generic.go:334] "Generic (PLEG): container finished" podID="1f7ba305-07fd-408c-865f-463e3738e6cb" containerID="bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7" exitCode=137 Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.661628 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell00878-account-delete-pjh75" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.685394 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03efe727-1f84-49e0-b6cb-a7189a02ba76" path="/var/lib/kubelet/pods/03efe727-1f84-49e0-b6cb-a7189a02ba76/volumes" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.686403 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c751e0c-75c7-4aaf-bf32-55e6d022d802" path="/var/lib/kubelet/pods/5c751e0c-75c7-4aaf-bf32-55e6d022d802/volumes" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.687093 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80309f7c-d137-4116-a447-c9749c27c669" path="/var/lib/kubelet/pods/80309f7c-d137-4116-a447-c9749c27c669/volumes" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.687851 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb766f51-b132-4979-b32e-a2cfcb3edb50" path="/var/lib/kubelet/pods/cb766f51-b132-4979-b32e-a2cfcb3edb50/volumes" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.688998 4732 generic.go:334] "Generic (PLEG): container finished" podID="c76de706-34bc-4b37-8492-3573c19e91c2" containerID="abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57" exitCode=137 Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.689143 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi0b57-account-delete-95cjw" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.702168 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00878-account-delete-pjh75" event={"ID":"1f7ba305-07fd-408c-865f-463e3738e6cb","Type":"ContainerDied","Data":"bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7"} Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.702231 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell00878-account-delete-pjh75" event={"ID":"1f7ba305-07fd-408c-865f-463e3738e6cb","Type":"ContainerDied","Data":"e6e45d2c8e155eccba8eac7001fad86d0afbc66d286de286a6da54351d909af9"} Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.702251 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi0b57-account-delete-95cjw" event={"ID":"c76de706-34bc-4b37-8492-3573c19e91c2","Type":"ContainerDied","Data":"abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57"} Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.702267 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi0b57-account-delete-95cjw" event={"ID":"c76de706-34bc-4b37-8492-3573c19e91c2","Type":"ContainerDied","Data":"e85918946a4cd07b24a7309533bb5dce5941629dbd2282efb0cae6d4d9dae54f"} Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.702289 4732 scope.go:117] "RemoveContainer" containerID="bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.726423 4732 scope.go:117] "RemoveContainer" containerID="bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.726684 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkrt8\" (UniqueName: \"kubernetes.io/projected/c76de706-34bc-4b37-8492-3573c19e91c2-kube-api-access-xkrt8\") pod 
\"c76de706-34bc-4b37-8492-3573c19e91c2\" (UID: \"c76de706-34bc-4b37-8492-3573c19e91c2\") " Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.726738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxkfm\" (UniqueName: \"kubernetes.io/projected/1f7ba305-07fd-408c-865f-463e3738e6cb-kube-api-access-pxkfm\") pod \"1f7ba305-07fd-408c-865f-463e3738e6cb\" (UID: \"1f7ba305-07fd-408c-865f-463e3738e6cb\") " Oct 10 07:14:01 crc kubenswrapper[4732]: E1010 07:14:01.727447 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7\": container with ID starting with bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7 not found: ID does not exist" containerID="bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.727480 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7"} err="failed to get container status \"bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7\": rpc error: code = NotFound desc = could not find container \"bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7\": container with ID starting with bde72e0d01527bbb933312766330c52eac066d13e956e90fba91a3ac916f1ab7 not found: ID does not exist" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.727501 4732 scope.go:117] "RemoveContainer" containerID="abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.731814 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7ba305-07fd-408c-865f-463e3738e6cb-kube-api-access-pxkfm" (OuterVolumeSpecName: "kube-api-access-pxkfm") pod 
"1f7ba305-07fd-408c-865f-463e3738e6cb" (UID: "1f7ba305-07fd-408c-865f-463e3738e6cb"). InnerVolumeSpecName "kube-api-access-pxkfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.731861 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76de706-34bc-4b37-8492-3573c19e91c2-kube-api-access-xkrt8" (OuterVolumeSpecName: "kube-api-access-xkrt8") pod "c76de706-34bc-4b37-8492-3573c19e91c2" (UID: "c76de706-34bc-4b37-8492-3573c19e91c2"). InnerVolumeSpecName "kube-api-access-xkrt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.743797 4732 scope.go:117] "RemoveContainer" containerID="abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57" Oct 10 07:14:01 crc kubenswrapper[4732]: E1010 07:14:01.744282 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57\": container with ID starting with abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57 not found: ID does not exist" containerID="abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.744318 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57"} err="failed to get container status \"abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57\": rpc error: code = NotFound desc = could not find container \"abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57\": container with ID starting with abec16288c03a52543f26222efbda5c425e8cd9fb03fd57c813db9d422405c57 not found: ID does not exist" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.828228 4732 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-xkrt8\" (UniqueName: \"kubernetes.io/projected/c76de706-34bc-4b37-8492-3573c19e91c2-kube-api-access-xkrt8\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:01 crc kubenswrapper[4732]: I1010 07:14:01.828474 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxkfm\" (UniqueName: \"kubernetes.io/projected/1f7ba305-07fd-408c-865f-463e3738e6cb-kube-api-access-pxkfm\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:02 crc kubenswrapper[4732]: I1010 07:14:02.021811 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell00878-account-delete-pjh75"] Oct 10 07:14:02 crc kubenswrapper[4732]: I1010 07:14:02.035519 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell00878-account-delete-pjh75"] Oct 10 07:14:02 crc kubenswrapper[4732]: I1010 07:14:02.040685 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi0b57-account-delete-95cjw"] Oct 10 07:14:02 crc kubenswrapper[4732]: I1010 07:14:02.044893 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi0b57-account-delete-95cjw"] Oct 10 07:14:03 crc kubenswrapper[4732]: I1010 07:14:03.677184 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f7ba305-07fd-408c-865f-463e3738e6cb" path="/var/lib/kubelet/pods/1f7ba305-07fd-408c-865f-463e3738e6cb/volumes" Oct 10 07:14:03 crc kubenswrapper[4732]: I1010 07:14:03.678728 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76de706-34bc-4b37-8492-3573c19e91c2" path="/var/lib/kubelet/pods/c76de706-34bc-4b37-8492-3573c19e91c2/volumes" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.229889 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6ftxn"] Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.230977 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" 
containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231001 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231026 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231041 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231064 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d80b654-a26e-46ea-84f4-264c3c883250" containerName="nova-cell1-conductor-conductor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231079 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d80b654-a26e-46ea-84f4-264c3c883250" containerName="nova-cell1-conductor-conductor" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231109 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerName="galera" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231122 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerName="galera" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231142 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="cinder-scheduler" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231154 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="cinder-scheduler" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231176 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231190 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231208 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="probe" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231220 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="probe" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231242 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231255 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-api" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231269 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea62a47-1d15-41a2-a0d0-a0456a46183a" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231283 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea62a47-1d15-41a2-a0d0-a0456a46183a" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231299 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231312 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231332 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231347 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231368 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231382 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231404 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" containerName="kube-state-metrics" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231417 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" containerName="kube-state-metrics" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231430 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76de706-34bc-4b37-8492-3573c19e91c2" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231445 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76de706-34bc-4b37-8492-3573c19e91c2" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231469 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="rsync" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231481 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="rsync" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231503 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="ovsdbserver-sb" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231516 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="ovsdbserver-sb" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231533 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231546 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231566 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-central-agent" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231579 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-central-agent" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231598 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerName="ovsdbserver-nb" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231612 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerName="ovsdbserver-nb" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231631 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231643 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231663 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="proxy-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231676 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="proxy-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231721 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231735 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-server" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231752 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13bb7b78-cc62-4d3b-a33a-9af77ee9e141" containerName="nova-cell0-conductor-conductor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231766 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bb7b78-cc62-4d3b-a33a-9af77ee9e141" containerName="nova-cell0-conductor-conductor" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231785 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c7a322-6bdd-4613-9a25-39391becbb81" containerName="nova-scheduler-scheduler" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231798 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c7a322-6bdd-4613-9a25-39391becbb81" containerName="nova-scheduler-scheduler" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231811 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-expirer" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231824 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-expirer" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231848 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-updater" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231860 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-updater" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231884 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231898 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231921 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231934 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231949 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-updater" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231962 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-updater" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.231977 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-notification-agent" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.231990 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-notification-agent" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232009 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232022 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232049 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232062 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-server" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232084 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232097 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-api" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232115 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232128 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232151 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a05e0a-f30f-4b7c-b939-eba8d0094d48" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232164 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a05e0a-f30f-4b7c-b939-eba8d0094d48" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232188 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" containerName="dnsmasq-dns" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232200 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" containerName="dnsmasq-dns" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232215 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80309f7c-d137-4116-a447-c9749c27c669" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232228 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="80309f7c-d137-4116-a447-c9749c27c669" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232253 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232266 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232284 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232297 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-server" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232316 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93e689a-691a-403b-970f-63547469bbfe" containerName="keystone-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232329 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93e689a-691a-403b-970f-63547469bbfe" containerName="keystone-api" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232345 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-reaper" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232433 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-reaper" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232461 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232475 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232496 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232509 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232526 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232539 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-server" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232563 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerName="setup-container" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232576 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerName="setup-container" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232599 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerName="mysql-bootstrap" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232612 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerName="mysql-bootstrap" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232631 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="swift-recon-cron" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232646 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="swift-recon-cron" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232667 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232680 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232722 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232735 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232760 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232773 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232788 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232801 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232821 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232835 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232857 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232869 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232883 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="sg-core" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232895 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="sg-core" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232912 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232925 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232948 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerName="galera" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232961 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerName="galera" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.232979 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab930cd4-caad-4980-a491-8f6c5abca8bf" containerName="memcached" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.232991 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab930cd4-caad-4980-a491-8f6c5abca8bf" containerName="memcached" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233008 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server-init" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233021 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server-init" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233039 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233051 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-api" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233075 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233087 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-log" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233105 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="ovn-northd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233118 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="ovn-northd" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233137 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233150 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233172 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb766f51-b132-4979-b32e-a2cfcb3edb50" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233185 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb766f51-b132-4979-b32e-a2cfcb3edb50" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233205 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="565f831c-0da8-4481-8461-8522e0cfa801" containerName="rabbitmq" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233218 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="565f831c-0da8-4481-8461-8522e0cfa801" containerName="rabbitmq" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233232 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7ba305-07fd-408c-865f-463e3738e6cb" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233244 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7ba305-07fd-408c-865f-463e3738e6cb" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233262 4732 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="565f831c-0da8-4481-8461-8522e0cfa801" containerName="setup-container" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233276 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="565f831c-0da8-4481-8461-8522e0cfa801" containerName="setup-container" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233290 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233302 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233331 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233344 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233363 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-metadata" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233376 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-metadata" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233395 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerName="rabbitmq" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233407 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerName="rabbitmq" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233420 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerName="mysql-bootstrap" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233433 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerName="mysql-bootstrap" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233456 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" containerName="init" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233468 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" containerName="init" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233486 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233499 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233520 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710f9fa6-588e-4226-a65d-5220d0a1f315" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233532 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="710f9fa6-588e-4226-a65d-5220d0a1f315" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233553 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c751e0c-75c7-4aaf-bf32-55e6d022d802" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233566 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c751e0c-75c7-4aaf-bf32-55e6d022d802" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233585 4732 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="03efe727-1f84-49e0-b6cb-a7189a02ba76" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233598 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="03efe727-1f84-49e0-b6cb-a7189a02ba76" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233620 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233633 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233648 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233661 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: E1010 07:14:25.233676 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233726 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.233986 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-reaper" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234012 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="710f9fa6-588e-4226-a65d-5220d0a1f315" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 
07:14:25.234033 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="proxy-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234047 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234072 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234095 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234115 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234132 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234156 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234177 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="03efe727-1f84-49e0-b6cb-a7189a02ba76" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234199 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c751e0c-75c7-4aaf-bf32-55e6d022d802" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234224 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c76de27-f32a-47ec-ba19-1b8e7a5e6be5" containerName="dnsmasq-dns" Oct 10 07:14:25 
crc kubenswrapper[4732]: I1010 07:14:25.234249 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234271 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbaa5798-1d07-445a-a226-ad48054d3dbc" containerName="galera" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234293 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56077f87-ea67-4080-b328-7186a7d0bf35" containerName="nova-api-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234316 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-central-agent" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234336 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a570b39e-7329-4bba-bfe0-cf5f7aa2269e" containerName="glance-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234360 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-auditor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234374 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234389 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a05e0a-f30f-4b7c-b939-eba8d0094d48" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234413 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab930cd4-caad-4980-a491-8f6c5abca8bf" containerName="memcached" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234432 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" 
containerName="nova-metadata-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234446 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f34ab2c-f804-4f24-a447-165d5afb984f" containerName="nova-metadata-metadata" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234465 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb766f51-b132-4979-b32e-a2cfcb3edb50" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234480 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234502 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a11668-5ab6-4b77-8bb7-ac60140f4bd4" containerName="rabbitmq" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234522 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234545 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="account-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234561 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="cinder-scheduler" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234576 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovsdb-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234593 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="ovn-northd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234613 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234627 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-server" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234642 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-expirer" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234656 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93e689a-691a-403b-970f-63547469bbfe" containerName="keystone-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234672 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234725 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7ba305-07fd-408c-865f-463e3738e6cb" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234741 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-replicator" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234757 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cddfa4e-ec03-4651-ae7b-87b2dc3ec030" containerName="ovs-vswitchd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234771 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76de706-34bc-4b37-8492-3573c19e91c2" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234794 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ed9ac89-1c81-4b22-b914-1c8bfcacbfa8" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234813 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="rsync" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234830 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234852 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="99f1b967-cc4d-4092-87e9-64cbbc84be27" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234873 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234890 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb179b69-8c25-49b1-88b5-6c17953ffbcd" containerName="cinder-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234909 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5d96c35-c01e-4f12-ab12-7b6342789b2f" containerName="barbican-keystone-listener" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234932 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b69405-bc4b-4e39-be49-0a966bc649bb" containerName="probe" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234951 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6dedf8-3428-4444-86f9-4f25c0b916e3" containerName="ovsdbserver-nb" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234972 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="80309f7c-d137-4116-a447-c9749c27c669" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.234996 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cac2fd-b97b-4cf2-b57f-dea34fc6f4a2" containerName="glance-httpd" Oct 10 07:14:25 crc 
kubenswrapper[4732]: I1010 07:14:25.235014 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="ceilometer-notification-agent" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235034 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7daaf3e5-82f0-45f7-aa22-40be65433320" containerName="proxy-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235052 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c7a322-6bdd-4613-9a25-39391becbb81" containerName="nova-scheduler-scheduler" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235069 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235089 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="63706a24-ebfd-45ae-96b0-49ab7bd13fdf" containerName="galera" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235107 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="swift-recon-cron" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235126 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea62a47-1d15-41a2-a0d0-a0456a46183a" containerName="mariadb-account-delete" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235145 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb94a64c-1a0c-4a61-bb69-e843b627cf35" containerName="neutron-httpd" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235162 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="13bb7b78-cc62-4d3b-a33a-9af77ee9e141" containerName="nova-cell0-conductor-conductor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235179 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="openstack-network-exporter" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235198 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8c3140a-2ab2-44f7-9ddd-73de883c4b65" containerName="ovn-controller" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235218 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-api" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235235 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2960d902-25b0-4fb8-baa7-fe7f9d4f5811" containerName="barbican-api-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235249 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fa844d-f411-49a9-a52f-256760a71157" containerName="barbican-worker" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235265 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="565f831c-0da8-4481-8461-8522e0cfa801" containerName="rabbitmq" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235284 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e37998e-491a-43b8-abda-4bdfea233217" containerName="placement-log" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235298 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed592ee3-6dab-41d4-8141-bb7c31b02f73" containerName="sg-core" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235320 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e6ee2c-4b3b-48af-860d-f23aea3c4c85" containerName="kube-state-metrics" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235334 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="container-updater" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235355 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6ec5be94-f09a-4728-8858-c18fbd9ca2c2" containerName="object-updater" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235374 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="64dcf265-8f29-46bc-9b03-40dda51f606b" containerName="ovsdbserver-sb" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.235391 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d80b654-a26e-46ea-84f4-264c3c883250" containerName="nova-cell1-conductor-conductor" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.238229 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.260151 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ftxn"] Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.286924 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rstt\" (UniqueName: \"kubernetes.io/projected/c69445af-d8ff-4fbd-b4a7-405a566c9c87-kube-api-access-6rstt\") pod \"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.286997 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-utilities\") pod \"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.287090 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-catalog-content\") pod 
\"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.388788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-catalog-content\") pod \"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.389165 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rstt\" (UniqueName: \"kubernetes.io/projected/c69445af-d8ff-4fbd-b4a7-405a566c9c87-kube-api-access-6rstt\") pod \"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.389305 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-utilities\") pod \"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.389369 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-catalog-content\") pod \"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.390008 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-utilities\") pod \"redhat-operators-6ftxn\" (UID: 
\"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.412560 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rstt\" (UniqueName: \"kubernetes.io/projected/c69445af-d8ff-4fbd-b4a7-405a566c9c87-kube-api-access-6rstt\") pod \"redhat-operators-6ftxn\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:25 crc kubenswrapper[4732]: I1010 07:14:25.579534 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:26 crc kubenswrapper[4732]: I1010 07:14:26.042715 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ftxn"] Oct 10 07:14:26 crc kubenswrapper[4732]: I1010 07:14:26.974828 4732 generic.go:334] "Generic (PLEG): container finished" podID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerID="0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3" exitCode=0 Oct 10 07:14:26 crc kubenswrapper[4732]: I1010 07:14:26.974929 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ftxn" event={"ID":"c69445af-d8ff-4fbd-b4a7-405a566c9c87","Type":"ContainerDied","Data":"0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3"} Oct 10 07:14:26 crc kubenswrapper[4732]: I1010 07:14:26.978739 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:14:26 crc kubenswrapper[4732]: I1010 07:14:26.980159 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ftxn" event={"ID":"c69445af-d8ff-4fbd-b4a7-405a566c9c87","Type":"ContainerStarted","Data":"936aaeb32def121d39f5983a042f8d81e41ed6e3abf60f6d8f5e86b1babcdd22"} Oct 10 07:14:27 crc kubenswrapper[4732]: I1010 07:14:27.992681 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ftxn" event={"ID":"c69445af-d8ff-4fbd-b4a7-405a566c9c87","Type":"ContainerStarted","Data":"9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341"} Oct 10 07:14:29 crc kubenswrapper[4732]: I1010 07:14:29.005271 4732 generic.go:334] "Generic (PLEG): container finished" podID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerID="9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341" exitCode=0 Oct 10 07:14:29 crc kubenswrapper[4732]: I1010 07:14:29.005332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ftxn" event={"ID":"c69445af-d8ff-4fbd-b4a7-405a566c9c87","Type":"ContainerDied","Data":"9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341"} Oct 10 07:14:30 crc kubenswrapper[4732]: I1010 07:14:30.018664 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ftxn" event={"ID":"c69445af-d8ff-4fbd-b4a7-405a566c9c87","Type":"ContainerStarted","Data":"e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777"} Oct 10 07:14:30 crc kubenswrapper[4732]: I1010 07:14:30.044887 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ftxn" podStartSLOduration=2.499489826 podStartE2EDuration="5.044868976s" podCreationTimestamp="2025-10-10 07:14:25 +0000 UTC" firstStartedPulling="2025-10-10 07:14:26.978170497 +0000 UTC m=+1394.047761778" lastFinishedPulling="2025-10-10 07:14:29.523549637 +0000 UTC m=+1396.593140928" observedRunningTime="2025-10-10 07:14:30.042034399 +0000 UTC m=+1397.111625660" watchObservedRunningTime="2025-10-10 07:14:30.044868976 +0000 UTC m=+1397.114460227" Oct 10 07:14:35 crc kubenswrapper[4732]: I1010 07:14:35.580218 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:35 crc kubenswrapper[4732]: I1010 
07:14:35.580841 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:36 crc kubenswrapper[4732]: I1010 07:14:36.663253 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6ftxn" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="registry-server" probeResult="failure" output=< Oct 10 07:14:36 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 07:14:36 crc kubenswrapper[4732]: > Oct 10 07:14:45 crc kubenswrapper[4732]: I1010 07:14:45.644425 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:45 crc kubenswrapper[4732]: I1010 07:14:45.695288 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:45 crc kubenswrapper[4732]: I1010 07:14:45.880067 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ftxn"] Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.199841 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6ftxn" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="registry-server" containerID="cri-o://e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777" gracePeriod=2 Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.620169 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.746628 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-utilities\") pod \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.746693 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-catalog-content\") pod \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.747071 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rstt\" (UniqueName: \"kubernetes.io/projected/c69445af-d8ff-4fbd-b4a7-405a566c9c87-kube-api-access-6rstt\") pod \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\" (UID: \"c69445af-d8ff-4fbd-b4a7-405a566c9c87\") " Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.748337 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-utilities" (OuterVolumeSpecName: "utilities") pod "c69445af-d8ff-4fbd-b4a7-405a566c9c87" (UID: "c69445af-d8ff-4fbd-b4a7-405a566c9c87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.751977 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69445af-d8ff-4fbd-b4a7-405a566c9c87-kube-api-access-6rstt" (OuterVolumeSpecName: "kube-api-access-6rstt") pod "c69445af-d8ff-4fbd-b4a7-405a566c9c87" (UID: "c69445af-d8ff-4fbd-b4a7-405a566c9c87"). InnerVolumeSpecName "kube-api-access-6rstt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.848911 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rstt\" (UniqueName: \"kubernetes.io/projected/c69445af-d8ff-4fbd-b4a7-405a566c9c87-kube-api-access-6rstt\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.848970 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.861266 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c69445af-d8ff-4fbd-b4a7-405a566c9c87" (UID: "c69445af-d8ff-4fbd-b4a7-405a566c9c87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:14:47 crc kubenswrapper[4732]: I1010 07:14:47.950284 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c69445af-d8ff-4fbd-b4a7-405a566c9c87-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.214527 4732 generic.go:334] "Generic (PLEG): container finished" podID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerID="e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777" exitCode=0 Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.214571 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ftxn" event={"ID":"c69445af-d8ff-4fbd-b4a7-405a566c9c87","Type":"ContainerDied","Data":"e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777"} Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.214602 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-6ftxn" event={"ID":"c69445af-d8ff-4fbd-b4a7-405a566c9c87","Type":"ContainerDied","Data":"936aaeb32def121d39f5983a042f8d81e41ed6e3abf60f6d8f5e86b1babcdd22"} Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.214602 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ftxn" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.214620 4732 scope.go:117] "RemoveContainer" containerID="e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.250649 4732 scope.go:117] "RemoveContainer" containerID="9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.254179 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ftxn"] Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.258623 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6ftxn"] Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.278372 4732 scope.go:117] "RemoveContainer" containerID="0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.317492 4732 scope.go:117] "RemoveContainer" containerID="e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777" Oct 10 07:14:48 crc kubenswrapper[4732]: E1010 07:14:48.317975 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777\": container with ID starting with e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777 not found: ID does not exist" containerID="e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.318029 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777"} err="failed to get container status \"e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777\": rpc error: code = NotFound desc = could not find container \"e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777\": container with ID starting with e8391ac05fa8c2b70152ffe43e5ec7934dec2ba165ee92a2432c44d51a76b777 not found: ID does not exist" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.318061 4732 scope.go:117] "RemoveContainer" containerID="9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341" Oct 10 07:14:48 crc kubenswrapper[4732]: E1010 07:14:48.318383 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341\": container with ID starting with 9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341 not found: ID does not exist" containerID="9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.318415 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341"} err="failed to get container status \"9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341\": rpc error: code = NotFound desc = could not find container \"9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341\": container with ID starting with 9271d124dafc4d4697d9f16f23fec8f397c16297a69a7d47a07d7486812ca341 not found: ID does not exist" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.318443 4732 scope.go:117] "RemoveContainer" containerID="0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3" Oct 10 07:14:48 crc kubenswrapper[4732]: E1010 
07:14:48.318727 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3\": container with ID starting with 0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3 not found: ID does not exist" containerID="0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3" Oct 10 07:14:48 crc kubenswrapper[4732]: I1010 07:14:48.318757 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3"} err="failed to get container status \"0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3\": rpc error: code = NotFound desc = could not find container \"0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3\": container with ID starting with 0d5e81155ee2b976abca0c9aaeb43122be9b6ca96bfa8b14c6603262eb2dd7d3 not found: ID does not exist" Oct 10 07:14:49 crc kubenswrapper[4732]: I1010 07:14:49.671824 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" path="/var/lib/kubelet/pods/c69445af-d8ff-4fbd-b4a7-405a566c9c87/volumes" Oct 10 07:14:55 crc kubenswrapper[4732]: I1010 07:14:55.356133 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:14:55 crc kubenswrapper[4732]: I1010 07:14:55.356770 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.175730 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj"] Oct 10 07:15:00 crc kubenswrapper[4732]: E1010 07:15:00.177325 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.177461 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="extract-utilities" Oct 10 07:15:00 crc kubenswrapper[4732]: E1010 07:15:00.177581 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.177674 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="extract-content" Oct 10 07:15:00 crc kubenswrapper[4732]: E1010 07:15:00.177843 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.180064 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.180681 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69445af-d8ff-4fbd-b4a7-405a566c9c87" containerName="registry-server" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.182903 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.187636 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.189165 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.201485 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj"] Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.262890 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nczq\" (UniqueName: \"kubernetes.io/projected/0e4a69df-b47c-4d0f-b438-11b3be02eabb-kube-api-access-7nczq\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.262973 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e4a69df-b47c-4d0f-b438-11b3be02eabb-config-volume\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.263161 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e4a69df-b47c-4d0f-b438-11b3be02eabb-secret-volume\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.364799 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e4a69df-b47c-4d0f-b438-11b3be02eabb-secret-volume\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.364874 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nczq\" (UniqueName: \"kubernetes.io/projected/0e4a69df-b47c-4d0f-b438-11b3be02eabb-kube-api-access-7nczq\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.364900 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e4a69df-b47c-4d0f-b438-11b3be02eabb-config-volume\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.365712 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e4a69df-b47c-4d0f-b438-11b3be02eabb-config-volume\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.374844 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0e4a69df-b47c-4d0f-b438-11b3be02eabb-secret-volume\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.388650 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nczq\" (UniqueName: \"kubernetes.io/projected/0e4a69df-b47c-4d0f-b438-11b3be02eabb-kube-api-access-7nczq\") pod \"collect-profiles-29334675-jdlfj\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.511879 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:00 crc kubenswrapper[4732]: I1010 07:15:00.822876 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj"] Oct 10 07:15:01 crc kubenswrapper[4732]: I1010 07:15:01.341147 4732 generic.go:334] "Generic (PLEG): container finished" podID="0e4a69df-b47c-4d0f-b438-11b3be02eabb" containerID="ed2c4ae9b06385d150392f311eae71369d6320f51e7558c8214bfde52c736ec5" exitCode=0 Oct 10 07:15:01 crc kubenswrapper[4732]: I1010 07:15:01.341226 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" event={"ID":"0e4a69df-b47c-4d0f-b438-11b3be02eabb","Type":"ContainerDied","Data":"ed2c4ae9b06385d150392f311eae71369d6320f51e7558c8214bfde52c736ec5"} Oct 10 07:15:01 crc kubenswrapper[4732]: I1010 07:15:01.341257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" 
event={"ID":"0e4a69df-b47c-4d0f-b438-11b3be02eabb","Type":"ContainerStarted","Data":"5b4643f94cd6e5da2093e1eb63e949e8eefe01856637e7e921ab59ceb293aa75"} Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.672023 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.801227 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e4a69df-b47c-4d0f-b438-11b3be02eabb-secret-volume\") pod \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.801410 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e4a69df-b47c-4d0f-b438-11b3be02eabb-config-volume\") pod \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.801469 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nczq\" (UniqueName: \"kubernetes.io/projected/0e4a69df-b47c-4d0f-b438-11b3be02eabb-kube-api-access-7nczq\") pod \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\" (UID: \"0e4a69df-b47c-4d0f-b438-11b3be02eabb\") " Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.802584 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e4a69df-b47c-4d0f-b438-11b3be02eabb-config-volume" (OuterVolumeSpecName: "config-volume") pod "0e4a69df-b47c-4d0f-b438-11b3be02eabb" (UID: "0e4a69df-b47c-4d0f-b438-11b3be02eabb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.810801 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e4a69df-b47c-4d0f-b438-11b3be02eabb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0e4a69df-b47c-4d0f-b438-11b3be02eabb" (UID: "0e4a69df-b47c-4d0f-b438-11b3be02eabb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.811005 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e4a69df-b47c-4d0f-b438-11b3be02eabb-kube-api-access-7nczq" (OuterVolumeSpecName: "kube-api-access-7nczq") pod "0e4a69df-b47c-4d0f-b438-11b3be02eabb" (UID: "0e4a69df-b47c-4d0f-b438-11b3be02eabb"). InnerVolumeSpecName "kube-api-access-7nczq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.903215 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0e4a69df-b47c-4d0f-b438-11b3be02eabb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.903243 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nczq\" (UniqueName: \"kubernetes.io/projected/0e4a69df-b47c-4d0f-b438-11b3be02eabb-kube-api-access-7nczq\") on node \"crc\" DevicePath \"\"" Oct 10 07:15:02 crc kubenswrapper[4732]: I1010 07:15:02.903254 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0e4a69df-b47c-4d0f-b438-11b3be02eabb-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:15:03 crc kubenswrapper[4732]: I1010 07:15:03.360822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" 
event={"ID":"0e4a69df-b47c-4d0f-b438-11b3be02eabb","Type":"ContainerDied","Data":"5b4643f94cd6e5da2093e1eb63e949e8eefe01856637e7e921ab59ceb293aa75"} Oct 10 07:15:03 crc kubenswrapper[4732]: I1010 07:15:03.361053 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4643f94cd6e5da2093e1eb63e949e8eefe01856637e7e921ab59ceb293aa75" Oct 10 07:15:03 crc kubenswrapper[4732]: I1010 07:15:03.360916 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj" Oct 10 07:15:20 crc kubenswrapper[4732]: I1010 07:15:20.946644 4732 scope.go:117] "RemoveContainer" containerID="9313d24a807329b7bc908127acb7ed66fd3b29e3013536c8569a4d247c4da533" Oct 10 07:15:20 crc kubenswrapper[4732]: I1010 07:15:20.994404 4732 scope.go:117] "RemoveContainer" containerID="aec28078f83d5c42b3c9af9c5e0ca7f503376268d645f1072d4e6fcefa958f9b" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.023966 4732 scope.go:117] "RemoveContainer" containerID="07d7e976e6a44abd2a9a09997eb9283af5abbbe0a5f237f7567c1ac40c5bfedb" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.056030 4732 scope.go:117] "RemoveContainer" containerID="0c3dc5c04d7019b10ef2df774a2829edb174126d7b1beab27007801a2719c27c" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.090507 4732 scope.go:117] "RemoveContainer" containerID="c5860bbb5f8be0551d48d61d130a9c503672d099b5b9b81ad7f7930e2d1a74c7" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.115263 4732 scope.go:117] "RemoveContainer" containerID="f0961a2f7d699cccb05bc8db81a5efe1cac37f28387231f8e6ffe35fee84e710" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.142420 4732 scope.go:117] "RemoveContainer" containerID="34891118edba856958b17de5cd936c661fa4707c5c701fe15672ea786340be95" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.161852 4732 scope.go:117] "RemoveContainer" 
containerID="9ccbb101b60a4c5d1f4f9801ccb65f5ad73e384c6a69193236f1dbf249839c94" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.188306 4732 scope.go:117] "RemoveContainer" containerID="0e4ff507cdac3d345c7599b2f38569e72cbb904578805d9196ce57b351e8fc3a" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.218380 4732 scope.go:117] "RemoveContainer" containerID="65ed626b6b047272043b5c3de8ee0323c467e7f133e08e9a6d0b893eee050ead" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.237654 4732 scope.go:117] "RemoveContainer" containerID="f2f4bd53eac7d02bf28780a04513f842ce166a920111292a88480652aad2eaf7" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.253382 4732 scope.go:117] "RemoveContainer" containerID="3c1c2951ed8b32ebb3e4b066da8fbc88f5d886c4680c8466ab93868d6060fee4" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.270338 4732 scope.go:117] "RemoveContainer" containerID="c5af1d769dd390d1c03be51f1c6e582e04b0639cabbea6907be5cbb11d6351ba" Oct 10 07:15:21 crc kubenswrapper[4732]: I1010 07:15:21.298108 4732 scope.go:117] "RemoveContainer" containerID="94533af429c5430b6ef2119bcdaab1c9b6bf64102f9f0379d38c2cf1c7403d0d" Oct 10 07:15:25 crc kubenswrapper[4732]: I1010 07:15:25.356318 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:15:25 crc kubenswrapper[4732]: I1010 07:15:25.356774 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.672443 4732 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-2zm87"] Oct 10 07:15:40 crc kubenswrapper[4732]: E1010 07:15:40.674231 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e4a69df-b47c-4d0f-b438-11b3be02eabb" containerName="collect-profiles" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.674253 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e4a69df-b47c-4d0f-b438-11b3be02eabb" containerName="collect-profiles" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.674475 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e4a69df-b47c-4d0f-b438-11b3be02eabb" containerName="collect-profiles" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.676069 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.694424 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zm87"] Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.796965 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8pzx\" (UniqueName: \"kubernetes.io/projected/e7a69369-25f0-4c16-b5ed-7dc383076b0c-kube-api-access-p8pzx\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.797021 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-utilities\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.797059 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-catalog-content\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.898545 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8pzx\" (UniqueName: \"kubernetes.io/projected/e7a69369-25f0-4c16-b5ed-7dc383076b0c-kube-api-access-p8pzx\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.898596 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-utilities\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.898625 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-catalog-content\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.899103 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-catalog-content\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.899596 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-utilities\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:40 crc kubenswrapper[4732]: I1010 07:15:40.922871 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8pzx\" (UniqueName: \"kubernetes.io/projected/e7a69369-25f0-4c16-b5ed-7dc383076b0c-kube-api-access-p8pzx\") pod \"redhat-marketplace-2zm87\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") " pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:41 crc kubenswrapper[4732]: I1010 07:15:41.026194 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zm87" Oct 10 07:15:41 crc kubenswrapper[4732]: I1010 07:15:41.269355 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zm87"] Oct 10 07:15:41 crc kubenswrapper[4732]: I1010 07:15:41.769418 4732 generic.go:334] "Generic (PLEG): container finished" podID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerID="2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70" exitCode=0 Oct 10 07:15:41 crc kubenswrapper[4732]: I1010 07:15:41.769522 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zm87" event={"ID":"e7a69369-25f0-4c16-b5ed-7dc383076b0c","Type":"ContainerDied","Data":"2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70"} Oct 10 07:15:41 crc kubenswrapper[4732]: I1010 07:15:41.770018 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zm87" event={"ID":"e7a69369-25f0-4c16-b5ed-7dc383076b0c","Type":"ContainerStarted","Data":"e24c463ed9366719fc30c14a9aeff12b289fe8a31ceae4cbd7889dbcbeab3160"} Oct 10 07:15:43 crc kubenswrapper[4732]: I1010 
07:15:43.795210 4732 generic.go:334] "Generic (PLEG): container finished" podID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerID="cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166" exitCode=0
Oct 10 07:15:43 crc kubenswrapper[4732]: I1010 07:15:43.795263 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zm87" event={"ID":"e7a69369-25f0-4c16-b5ed-7dc383076b0c","Type":"ContainerDied","Data":"cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166"}
Oct 10 07:15:44 crc kubenswrapper[4732]: I1010 07:15:44.829859 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zm87" event={"ID":"e7a69369-25f0-4c16-b5ed-7dc383076b0c","Type":"ContainerStarted","Data":"a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d"}
Oct 10 07:15:44 crc kubenswrapper[4732]: I1010 07:15:44.860853 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2zm87" podStartSLOduration=2.419894867 podStartE2EDuration="4.860832367s" podCreationTimestamp="2025-10-10 07:15:40 +0000 UTC" firstStartedPulling="2025-10-10 07:15:41.774486208 +0000 UTC m=+1468.844077489" lastFinishedPulling="2025-10-10 07:15:44.215423748 +0000 UTC m=+1471.285014989" observedRunningTime="2025-10-10 07:15:44.857451544 +0000 UTC m=+1471.927042815" watchObservedRunningTime="2025-10-10 07:15:44.860832367 +0000 UTC m=+1471.930423608"
Oct 10 07:15:51 crc kubenswrapper[4732]: I1010 07:15:51.026676 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2zm87"
Oct 10 07:15:51 crc kubenswrapper[4732]: I1010 07:15:51.027301 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2zm87"
Oct 10 07:15:51 crc kubenswrapper[4732]: I1010 07:15:51.075396 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2zm87"
Oct 10 07:15:51 crc kubenswrapper[4732]: I1010 07:15:51.954152 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2zm87"
Oct 10 07:15:52 crc kubenswrapper[4732]: I1010 07:15:52.005284 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zm87"]
Oct 10 07:15:53 crc kubenswrapper[4732]: I1010 07:15:53.914944 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2zm87" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="registry-server" containerID="cri-o://a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d" gracePeriod=2
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.378447 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zm87"
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.494459 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-catalog-content\") pod \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") "
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.494586 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8pzx\" (UniqueName: \"kubernetes.io/projected/e7a69369-25f0-4c16-b5ed-7dc383076b0c-kube-api-access-p8pzx\") pod \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") "
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.494657 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-utilities\") pod \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\" (UID: \"e7a69369-25f0-4c16-b5ed-7dc383076b0c\") "
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.495666 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-utilities" (OuterVolumeSpecName: "utilities") pod "e7a69369-25f0-4c16-b5ed-7dc383076b0c" (UID: "e7a69369-25f0-4c16-b5ed-7dc383076b0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.504062 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a69369-25f0-4c16-b5ed-7dc383076b0c-kube-api-access-p8pzx" (OuterVolumeSpecName: "kube-api-access-p8pzx") pod "e7a69369-25f0-4c16-b5ed-7dc383076b0c" (UID: "e7a69369-25f0-4c16-b5ed-7dc383076b0c"). InnerVolumeSpecName "kube-api-access-p8pzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.507491 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7a69369-25f0-4c16-b5ed-7dc383076b0c" (UID: "e7a69369-25f0-4c16-b5ed-7dc383076b0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.596179 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.596218 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7a69369-25f0-4c16-b5ed-7dc383076b0c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.596232 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8pzx\" (UniqueName: \"kubernetes.io/projected/e7a69369-25f0-4c16-b5ed-7dc383076b0c-kube-api-access-p8pzx\") on node \"crc\" DevicePath \"\""
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.943732 4732 generic.go:334] "Generic (PLEG): container finished" podID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerID="a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d" exitCode=0
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.943803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zm87" event={"ID":"e7a69369-25f0-4c16-b5ed-7dc383076b0c","Type":"ContainerDied","Data":"a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d"}
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.943845 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2zm87" event={"ID":"e7a69369-25f0-4c16-b5ed-7dc383076b0c","Type":"ContainerDied","Data":"e24c463ed9366719fc30c14a9aeff12b289fe8a31ceae4cbd7889dbcbeab3160"}
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.943867 4732 scope.go:117] "RemoveContainer" containerID="a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d"
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.943889 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2zm87"
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.972371 4732 scope.go:117] "RemoveContainer" containerID="cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166"
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.981018 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zm87"]
Oct 10 07:15:54 crc kubenswrapper[4732]: I1010 07:15:54.987870 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2zm87"]
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.014933 4732 scope.go:117] "RemoveContainer" containerID="2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.042178 4732 scope.go:117] "RemoveContainer" containerID="a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d"
Oct 10 07:15:55 crc kubenswrapper[4732]: E1010 07:15:55.043033 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d\": container with ID starting with a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d not found: ID does not exist" containerID="a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.043068 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d"} err="failed to get container status \"a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d\": rpc error: code = NotFound desc = could not find container \"a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d\": container with ID starting with a53534b1c295e25f1da109071f4f64360df5415f31e5e8e0b1915c126bf0049d not found: ID does not exist"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.043117 4732 scope.go:117] "RemoveContainer" containerID="cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166"
Oct 10 07:15:55 crc kubenswrapper[4732]: E1010 07:15:55.043556 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166\": container with ID starting with cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166 not found: ID does not exist" containerID="cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.043580 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166"} err="failed to get container status \"cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166\": rpc error: code = NotFound desc = could not find container \"cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166\": container with ID starting with cb4011d524319ffe804f0088f94f33d87341ac9bc344e581e008692a41a8f166 not found: ID does not exist"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.043593 4732 scope.go:117] "RemoveContainer" containerID="2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70"
Oct 10 07:15:55 crc kubenswrapper[4732]: E1010 07:15:55.043978 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70\": container with ID starting with 2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70 not found: ID does not exist" containerID="2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.044003 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70"} err="failed to get container status \"2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70\": rpc error: code = NotFound desc = could not find container \"2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70\": container with ID starting with 2e671d4856a9305b80f3e536d4db2f57bd9b55830a4cd322cc85ae380375ff70 not found: ID does not exist"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.356554 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.357007 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.357111 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.358297 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.358411 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" gracePeriod=600
Oct 10 07:15:55 crc kubenswrapper[4732]: E1010 07:15:55.483526 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.676823 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" path="/var/lib/kubelet/pods/e7a69369-25f0-4c16-b5ed-7dc383076b0c/volumes"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.960075 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" exitCode=0
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.960125 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"}
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.960158 4732 scope.go:117] "RemoveContainer" containerID="58650635f1fbb6c2c9e22b572de1a4b4db4e63148a8b451b4d739ea558750b87"
Oct 10 07:15:55 crc kubenswrapper[4732]: I1010 07:15:55.962104 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"
Oct 10 07:15:55 crc kubenswrapper[4732]: E1010 07:15:55.962504 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.420662 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4kpnf"]
Oct 10 07:15:58 crc kubenswrapper[4732]: E1010 07:15:58.421122 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="extract-utilities"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.421142 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="extract-utilities"
Oct 10 07:15:58 crc kubenswrapper[4732]: E1010 07:15:58.421166 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="extract-content"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.421177 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="extract-content"
Oct 10 07:15:58 crc kubenswrapper[4732]: E1010 07:15:58.421219 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="registry-server"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.421233 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="registry-server"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.421483 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a69369-25f0-4c16-b5ed-7dc383076b0c" containerName="registry-server"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.423926 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.448380 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4kpnf"]
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.572712 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-utilities\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.572813 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-catalog-content\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.572839 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7swjn\" (UniqueName: \"kubernetes.io/projected/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-kube-api-access-7swjn\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.673760 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-catalog-content\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.673806 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7swjn\" (UniqueName: \"kubernetes.io/projected/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-kube-api-access-7swjn\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.673852 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-utilities\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.674295 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-catalog-content\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.674333 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-utilities\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.699983 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7swjn\" (UniqueName: \"kubernetes.io/projected/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-kube-api-access-7swjn\") pod \"certified-operators-4kpnf\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") " pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:58 crc kubenswrapper[4732]: I1010 07:15:58.752021 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:15:59 crc kubenswrapper[4732]: I1010 07:15:59.212060 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4kpnf"]
Oct 10 07:16:00 crc kubenswrapper[4732]: I1010 07:16:00.020508 4732 generic.go:334] "Generic (PLEG): container finished" podID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerID="8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79" exitCode=0
Oct 10 07:16:00 crc kubenswrapper[4732]: I1010 07:16:00.020563 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kpnf" event={"ID":"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2","Type":"ContainerDied","Data":"8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79"}
Oct 10 07:16:00 crc kubenswrapper[4732]: I1010 07:16:00.020596 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kpnf" event={"ID":"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2","Type":"ContainerStarted","Data":"0123808f77a7229351d44af4b5326c764bb2939584332bd5bddc7a7ca75a4a98"}
Oct 10 07:16:02 crc kubenswrapper[4732]: I1010 07:16:02.042523 4732 generic.go:334] "Generic (PLEG): container finished" podID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerID="5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4" exitCode=0
Oct 10 07:16:02 crc kubenswrapper[4732]: I1010 07:16:02.043108 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kpnf" event={"ID":"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2","Type":"ContainerDied","Data":"5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4"}
Oct 10 07:16:02 crc kubenswrapper[4732]: E1010 07:16:02.124666 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe8f06a_f4e8_45be_85dd_357fc70fc0e2.slice/crio-5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4.scope\": RecentStats: unable to find data in memory cache]"
Oct 10 07:16:03 crc kubenswrapper[4732]: I1010 07:16:03.058324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kpnf" event={"ID":"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2","Type":"ContainerStarted","Data":"1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d"}
Oct 10 07:16:03 crc kubenswrapper[4732]: I1010 07:16:03.097170 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4kpnf" podStartSLOduration=2.613634223 podStartE2EDuration="5.09714464s" podCreationTimestamp="2025-10-10 07:15:58 +0000 UTC" firstStartedPulling="2025-10-10 07:16:00.027936965 +0000 UTC m=+1487.097528246" lastFinishedPulling="2025-10-10 07:16:02.511447382 +0000 UTC m=+1489.581038663" observedRunningTime="2025-10-10 07:16:03.095203657 +0000 UTC m=+1490.164794988" watchObservedRunningTime="2025-10-10 07:16:03.09714464 +0000 UTC m=+1490.166735911"
Oct 10 07:16:08 crc kubenswrapper[4732]: I1010 07:16:08.660363 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"
Oct 10 07:16:08 crc kubenswrapper[4732]: E1010 07:16:08.661134 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 07:16:08 crc kubenswrapper[4732]: I1010 07:16:08.752652 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:16:08 crc kubenswrapper[4732]: I1010 07:16:08.752797 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:16:08 crc kubenswrapper[4732]: I1010 07:16:08.825579 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:16:09 crc kubenswrapper[4732]: I1010 07:16:09.184586 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:16:09 crc kubenswrapper[4732]: I1010 07:16:09.247264 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4kpnf"]
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.147355 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4kpnf" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerName="registry-server" containerID="cri-o://1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d" gracePeriod=2
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.610648 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.674523 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7swjn\" (UniqueName: \"kubernetes.io/projected/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-kube-api-access-7swjn\") pod \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") "
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.674614 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-utilities\") pod \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") "
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.674657 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-catalog-content\") pod \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\" (UID: \"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2\") "
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.676743 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-utilities" (OuterVolumeSpecName: "utilities") pod "dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" (UID: "dfe8f06a-f4e8-45be-85dd-357fc70fc0e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.683435 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-kube-api-access-7swjn" (OuterVolumeSpecName: "kube-api-access-7swjn") pod "dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" (UID: "dfe8f06a-f4e8-45be-85dd-357fc70fc0e2"). InnerVolumeSpecName "kube-api-access-7swjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.742569 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" (UID: "dfe8f06a-f4e8-45be-85dd-357fc70fc0e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.777173 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.777224 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 07:16:11 crc kubenswrapper[4732]: I1010 07:16:11.777245 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7swjn\" (UniqueName: \"kubernetes.io/projected/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2-kube-api-access-7swjn\") on node \"crc\" DevicePath \"\""
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.162749 4732 generic.go:334] "Generic (PLEG): container finished" podID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerID="1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d" exitCode=0
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.162894 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kpnf" event={"ID":"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2","Type":"ContainerDied","Data":"1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d"}
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.162983 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4kpnf" event={"ID":"dfe8f06a-f4e8-45be-85dd-357fc70fc0e2","Type":"ContainerDied","Data":"0123808f77a7229351d44af4b5326c764bb2939584332bd5bddc7a7ca75a4a98"}
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.163030 4732 scope.go:117] "RemoveContainer" containerID="1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.162918 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4kpnf"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.189799 4732 scope.go:117] "RemoveContainer" containerID="5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.217388 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4kpnf"]
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.223325 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4kpnf"]
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.243803 4732 scope.go:117] "RemoveContainer" containerID="8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.262808 4732 scope.go:117] "RemoveContainer" containerID="1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d"
Oct 10 07:16:12 crc kubenswrapper[4732]: E1010 07:16:12.263372 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d\": container with ID starting with 1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d not found: ID does not exist" containerID="1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.263455 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d"} err="failed to get container status \"1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d\": rpc error: code = NotFound desc = could not find container \"1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d\": container with ID starting with 1aae198b87a5d3cf0adae42f248c7b6e8bbca67822aacad539e40d76c645d79d not found: ID does not exist"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.263490 4732 scope.go:117] "RemoveContainer" containerID="5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4"
Oct 10 07:16:12 crc kubenswrapper[4732]: E1010 07:16:12.266324 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4\": container with ID starting with 5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4 not found: ID does not exist" containerID="5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.266360 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4"} err="failed to get container status \"5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4\": rpc error: code = NotFound desc = could not find container \"5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4\": container with ID starting with 5c4189618939d63ee6ecabd879dc1c252b980ba35269f4ac9e833a710ed45cc4 not found: ID does not exist"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.266385 4732 scope.go:117] "RemoveContainer" containerID="8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79"
Oct 10 07:16:12 crc kubenswrapper[4732]: E1010 07:16:12.266607 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79\": container with ID starting with 8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79 not found: ID does not exist" containerID="8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79"
Oct 10 07:16:12 crc kubenswrapper[4732]: I1010 07:16:12.266628 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79"} err="failed to get container status \"8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79\": rpc error: code = NotFound desc = could not find container \"8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79\": container with ID starting with 8821b70d53024576a37cceab513262afa4bbe89704ca09bfaa5747f367600d79 not found: ID does not exist"
Oct 10 07:16:12 crc kubenswrapper[4732]: E1010 07:16:12.311268 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe8f06a_f4e8_45be_85dd_357fc70fc0e2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe8f06a_f4e8_45be_85dd_357fc70fc0e2.slice/crio-0123808f77a7229351d44af4b5326c764bb2939584332bd5bddc7a7ca75a4a98\": RecentStats: unable to find data in memory cache]"
Oct 10 07:16:13 crc kubenswrapper[4732]: I1010 07:16:13.678070 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" path="/var/lib/kubelet/pods/dfe8f06a-f4e8-45be-85dd-357fc70fc0e2/volumes"
Oct 10 07:16:19 crc kubenswrapper[4732]: I1010 07:16:19.662634 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"
Oct 10 07:16:19 crc kubenswrapper[4732]: E1010 07:16:19.664012 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.592075 4732 scope.go:117] "RemoveContainer" containerID="b767fdf343e2f054e157561352aab0e3261a07fc8d390dbd234c4209ebbdc59a"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.651255 4732 scope.go:117] "RemoveContainer" containerID="dde9f40ba27d41aad5ca56d92142e35bed603ba909423eba0d09a696a6b9a237"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.670067 4732 scope.go:117] "RemoveContainer" containerID="f33cbdea2122f73beec7de26dbafdccd714f4862f702da77f5ed82dd6b03e431"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.707489 4732 scope.go:117] "RemoveContainer" containerID="62b7b16cc6a1f4bc0fab4fef610ce804f0be794fe762edd6193d6881fc145b47"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.729602 4732 scope.go:117] "RemoveContainer" containerID="b7464fe56625fd8bdabbebfebe13747ad27ebcc9839bed3c455e68309f1b3a7d"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.762938 4732 scope.go:117] "RemoveContainer" containerID="d1b9c9ec1328c0bb6e1e23ba7b0563f2efb58e6dd7800267dc005eab16e1c685"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.807106 4732 scope.go:117] "RemoveContainer" containerID="f8c61dd771da3d41b95b01e1da13533f302a1c15793a69ab61292b755af3bd72"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.861766 4732 scope.go:117] "RemoveContainer" containerID="c1f28765adbde199f44b27a3b0a6b9a0ff884e2ac8249464dee32eb1609af819"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.883826 4732 scope.go:117] "RemoveContainer" containerID="33f31a62b7e6468ccf5c144e82ff3c82dc099694dfba101874c4dfcecc1dcd59"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.904499 4732 scope.go:117] "RemoveContainer" containerID="c932149ea67159f9a84814065ec885afcf27d6ffe2cca727317cb446e3e1836a"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.942915 4732 scope.go:117] "RemoveContainer" containerID="ba7a1f03f18ae86234997ab8ec3532045109f6ac2550d2fdf25633eb2d62be0a"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.964614 4732 scope.go:117] "RemoveContainer" containerID="9e5587596a7f6545f5ee41c7fe004abf66a409e2bfb223d64c2066b916dae202"
Oct 10 07:16:21 crc kubenswrapper[4732]: I1010 07:16:21.978162 4732 scope.go:117] "RemoveContainer" containerID="a75822e5a435154e34f2342407e93653c250150aa62c0761fe0af6d275499cd7"
Oct 10 07:16:30 crc kubenswrapper[4732]: I1010 07:16:30.660725 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"
Oct 10 07:16:30 crc kubenswrapper[4732]: E1010 07:16:30.661624 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 07:16:42 crc kubenswrapper[4732]: I1010 07:16:42.661169 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3"
Oct 10 07:16:42 crc kubenswrapper[4732]: E1010 07:16:42.662187 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:16:54 crc kubenswrapper[4732]: I1010 07:16:54.660175 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:16:54 crc kubenswrapper[4732]: E1010 07:16:54.660901 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:17:08 crc kubenswrapper[4732]: I1010 07:17:08.660174 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:17:08 crc kubenswrapper[4732]: E1010 07:17:08.661092 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:17:20 crc kubenswrapper[4732]: I1010 07:17:20.660762 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:17:20 crc kubenswrapper[4732]: E1010 07:17:20.661651 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.182743 4732 scope.go:117] "RemoveContainer" containerID="1f3d13873970bd134f2543c565b451c32ae0541427673f4a084babf5197a72ef" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.212903 4732 scope.go:117] "RemoveContainer" containerID="f109eab8f1fb8cc71f4902a857174c8040a04484109856debe238368907364e0" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.247033 4732 scope.go:117] "RemoveContainer" containerID="b888ac0b6d592ec667eb861df78abe425788617cf262c3af3800b0ec2cf59863" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.267796 4732 scope.go:117] "RemoveContainer" containerID="58d341eb205d877224a5cb6a46597c2de42b015fa39f99b361d6a4d67ded1cc4" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.289194 4732 scope.go:117] "RemoveContainer" containerID="cc76fa90e7b162f0b66e824eee5ff268aceed1434ce758c6900d6e7104073f19" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.307150 4732 scope.go:117] "RemoveContainer" containerID="f4d100a6f0ecddf29e4d78c67e5019b69856d21ffbb0b74a4b0262657c6c0304" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.327746 4732 scope.go:117] "RemoveContainer" containerID="8b20c40d1dd65dff94ce2787bef4e91341b5943f617e24c08f9f51746adf32f3" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.378770 4732 scope.go:117] "RemoveContainer" containerID="81bc0eabbce19d596497bbee4c0eaceeb22f5b1132be590874c75ea9e4b56d03" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.404974 4732 scope.go:117] "RemoveContainer" containerID="a2fde98389bfd263442de180a19d9928051eed6c67f10e41cd4202a46c1c7e22" Oct 10 07:17:22 crc 
kubenswrapper[4732]: I1010 07:17:22.425939 4732 scope.go:117] "RemoveContainer" containerID="3cd3567830dcb39ce650b51a0ea69fd5975608aac30a059595b9a8f437180072" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.450980 4732 scope.go:117] "RemoveContainer" containerID="641a6cdf412db0e2310d7f541cc247d83efc9d202289b3873b100377259ddddd" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.477965 4732 scope.go:117] "RemoveContainer" containerID="f366ccf0fd7eff9163283eb01f40b778944fbee5750e2fdcbc35a6bd70d5f9a8" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.495036 4732 scope.go:117] "RemoveContainer" containerID="d76c61903df35f0f8176003951cca022e8081f2049617d8550fff70d06901f35" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.519216 4732 scope.go:117] "RemoveContainer" containerID="e8717780db17ff634dedc1118d77f3a0be22750ac0a7490e3e813d6089986154" Oct 10 07:17:22 crc kubenswrapper[4732]: I1010 07:17:22.542491 4732 scope.go:117] "RemoveContainer" containerID="10d139c980b60b956965d5489b41546b308f38d9be28a63508a7305b312c69c4" Oct 10 07:17:35 crc kubenswrapper[4732]: I1010 07:17:35.661103 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:17:35 crc kubenswrapper[4732]: E1010 07:17:35.662040 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:17:49 crc kubenswrapper[4732]: I1010 07:17:49.660219 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:17:49 crc kubenswrapper[4732]: E1010 07:17:49.661055 4732 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:18:00 crc kubenswrapper[4732]: I1010 07:18:00.660259 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:18:00 crc kubenswrapper[4732]: E1010 07:18:00.661233 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:18:11 crc kubenswrapper[4732]: I1010 07:18:11.660850 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:18:11 crc kubenswrapper[4732]: E1010 07:18:11.661651 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:18:22 crc kubenswrapper[4732]: I1010 07:18:22.660613 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:18:22 crc kubenswrapper[4732]: E1010 07:18:22.661792 4732 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:18:22 crc kubenswrapper[4732]: I1010 07:18:22.729988 4732 scope.go:117] "RemoveContainer" containerID="21ceed9bdb5694277a6fbcae24c2d9f361e101085a3956c5cf75871537da4926" Oct 10 07:18:22 crc kubenswrapper[4732]: I1010 07:18:22.756158 4732 scope.go:117] "RemoveContainer" containerID="f9506fceaa77699397e7b29b0e67d5a568de582fd92174a2332250afa9eed955" Oct 10 07:18:22 crc kubenswrapper[4732]: I1010 07:18:22.785616 4732 scope.go:117] "RemoveContainer" containerID="073506bc2a6aa3d7b04eef8fddc9eaec5dc5670eb489148b26507efe9a484841" Oct 10 07:18:22 crc kubenswrapper[4732]: I1010 07:18:22.812653 4732 scope.go:117] "RemoveContainer" containerID="37cc9165f0d1920ae33dae00b95b24914cdd3fbd5dce1ff153580c83b03a2d77" Oct 10 07:18:22 crc kubenswrapper[4732]: I1010 07:18:22.862858 4732 scope.go:117] "RemoveContainer" containerID="04b3bf834ce0e2c1ec7e80b92fd88bfde1cce05f5dc166ae8556a77592cda914" Oct 10 07:18:35 crc kubenswrapper[4732]: I1010 07:18:35.661095 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:18:35 crc kubenswrapper[4732]: E1010 07:18:35.661971 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 
07:18:46 crc kubenswrapper[4732]: I1010 07:18:46.660259 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:18:46 crc kubenswrapper[4732]: E1010 07:18:46.661365 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:19:00 crc kubenswrapper[4732]: I1010 07:19:00.660015 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:19:00 crc kubenswrapper[4732]: E1010 07:19:00.661330 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:19:12 crc kubenswrapper[4732]: I1010 07:19:12.661301 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:19:12 crc kubenswrapper[4732]: E1010 07:19:12.662497 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:19:23 crc kubenswrapper[4732]: I1010 07:19:23.007281 4732 scope.go:117] "RemoveContainer" containerID="d540a221c0791b2252e725e31d91ce55eb96da863e87dd65bb8ea6b3079291e6" Oct 10 07:19:23 crc kubenswrapper[4732]: I1010 07:19:23.666113 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:19:23 crc kubenswrapper[4732]: E1010 07:19:23.666979 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:19:35 crc kubenswrapper[4732]: I1010 07:19:35.660279 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:19:35 crc kubenswrapper[4732]: E1010 07:19:35.661049 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:19:50 crc kubenswrapper[4732]: I1010 07:19:50.661243 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:19:50 crc kubenswrapper[4732]: E1010 07:19:50.664454 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:20:04 crc kubenswrapper[4732]: I1010 07:20:04.660810 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:20:04 crc kubenswrapper[4732]: E1010 07:20:04.661766 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:20:17 crc kubenswrapper[4732]: I1010 07:20:17.660736 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:20:17 crc kubenswrapper[4732]: E1010 07:20:17.661408 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:20:29 crc kubenswrapper[4732]: I1010 07:20:29.660737 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:20:29 crc kubenswrapper[4732]: E1010 07:20:29.661519 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:20:44 crc kubenswrapper[4732]: I1010 07:20:44.660464 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:20:44 crc kubenswrapper[4732]: E1010 07:20:44.661453 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:20:56 crc kubenswrapper[4732]: I1010 07:20:56.660937 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:20:56 crc kubenswrapper[4732]: I1010 07:20:56.883782 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"47ff4480f036b42a35a03b5ca25dbd459d4294026c75ade3a4233c6e6dc96947"} Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.193845 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n6765"] Oct 10 07:21:38 crc kubenswrapper[4732]: E1010 07:21:38.195288 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerName="extract-content" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.195325 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" 
containerName="extract-content" Oct 10 07:21:38 crc kubenswrapper[4732]: E1010 07:21:38.195346 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerName="extract-utilities" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.195364 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerName="extract-utilities" Oct 10 07:21:38 crc kubenswrapper[4732]: E1010 07:21:38.195395 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerName="registry-server" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.195413 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerName="registry-server" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.195834 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe8f06a-f4e8-45be-85dd-357fc70fc0e2" containerName="registry-server" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.198423 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.206793 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6765"] Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.250472 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-catalog-content\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.250638 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp8qc\" (UniqueName: \"kubernetes.io/projected/cff1d2b6-6f33-459c-bf20-48cbc1463d16-kube-api-access-gp8qc\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.250869 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-utilities\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.352006 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-catalog-content\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.352084 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gp8qc\" (UniqueName: \"kubernetes.io/projected/cff1d2b6-6f33-459c-bf20-48cbc1463d16-kube-api-access-gp8qc\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.352131 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-utilities\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.352712 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-utilities\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.352711 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-catalog-content\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.377730 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp8qc\" (UniqueName: \"kubernetes.io/projected/cff1d2b6-6f33-459c-bf20-48cbc1463d16-kube-api-access-gp8qc\") pod \"community-operators-n6765\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.531004 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:38 crc kubenswrapper[4732]: I1010 07:21:38.998473 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n6765"] Oct 10 07:21:39 crc kubenswrapper[4732]: I1010 07:21:39.268173 4732 generic.go:334] "Generic (PLEG): container finished" podID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerID="17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0" exitCode=0 Oct 10 07:21:39 crc kubenswrapper[4732]: I1010 07:21:39.268224 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6765" event={"ID":"cff1d2b6-6f33-459c-bf20-48cbc1463d16","Type":"ContainerDied","Data":"17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0"} Oct 10 07:21:39 crc kubenswrapper[4732]: I1010 07:21:39.268256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6765" event={"ID":"cff1d2b6-6f33-459c-bf20-48cbc1463d16","Type":"ContainerStarted","Data":"44f4376913505bd9c0ec2df7422955d951c2a496da3029223e19b5c6e854a250"} Oct 10 07:21:39 crc kubenswrapper[4732]: I1010 07:21:39.271088 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:21:40 crc kubenswrapper[4732]: I1010 07:21:40.282276 4732 generic.go:334] "Generic (PLEG): container finished" podID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerID="72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e" exitCode=0 Oct 10 07:21:40 crc kubenswrapper[4732]: I1010 07:21:40.282880 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6765" event={"ID":"cff1d2b6-6f33-459c-bf20-48cbc1463d16","Type":"ContainerDied","Data":"72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e"} Oct 10 07:21:41 crc kubenswrapper[4732]: I1010 07:21:41.294291 4732 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-n6765" event={"ID":"cff1d2b6-6f33-459c-bf20-48cbc1463d16","Type":"ContainerStarted","Data":"565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd"} Oct 10 07:21:41 crc kubenswrapper[4732]: I1010 07:21:41.333322 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n6765" podStartSLOduration=1.871654634 podStartE2EDuration="3.33329167s" podCreationTimestamp="2025-10-10 07:21:38 +0000 UTC" firstStartedPulling="2025-10-10 07:21:39.270610768 +0000 UTC m=+1826.340202049" lastFinishedPulling="2025-10-10 07:21:40.732247834 +0000 UTC m=+1827.801839085" observedRunningTime="2025-10-10 07:21:41.320483092 +0000 UTC m=+1828.390074403" watchObservedRunningTime="2025-10-10 07:21:41.33329167 +0000 UTC m=+1828.402882951" Oct 10 07:21:48 crc kubenswrapper[4732]: I1010 07:21:48.532056 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:48 crc kubenswrapper[4732]: I1010 07:21:48.532937 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:48 crc kubenswrapper[4732]: I1010 07:21:48.592589 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:49 crc kubenswrapper[4732]: I1010 07:21:49.453030 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:49 crc kubenswrapper[4732]: I1010 07:21:49.537974 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6765"] Oct 10 07:21:51 crc kubenswrapper[4732]: I1010 07:21:51.386666 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n6765" 
podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="registry-server" containerID="cri-o://565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd" gracePeriod=2 Oct 10 07:21:51 crc kubenswrapper[4732]: I1010 07:21:51.825467 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.005169 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-catalog-content\") pod \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.005399 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp8qc\" (UniqueName: \"kubernetes.io/projected/cff1d2b6-6f33-459c-bf20-48cbc1463d16-kube-api-access-gp8qc\") pod \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.005455 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-utilities\") pod \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\" (UID: \"cff1d2b6-6f33-459c-bf20-48cbc1463d16\") " Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.006526 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-utilities" (OuterVolumeSpecName: "utilities") pod "cff1d2b6-6f33-459c-bf20-48cbc1463d16" (UID: "cff1d2b6-6f33-459c-bf20-48cbc1463d16"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.018113 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff1d2b6-6f33-459c-bf20-48cbc1463d16-kube-api-access-gp8qc" (OuterVolumeSpecName: "kube-api-access-gp8qc") pod "cff1d2b6-6f33-459c-bf20-48cbc1463d16" (UID: "cff1d2b6-6f33-459c-bf20-48cbc1463d16"). InnerVolumeSpecName "kube-api-access-gp8qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.091583 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cff1d2b6-6f33-459c-bf20-48cbc1463d16" (UID: "cff1d2b6-6f33-459c-bf20-48cbc1463d16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.107345 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.107391 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp8qc\" (UniqueName: \"kubernetes.io/projected/cff1d2b6-6f33-459c-bf20-48cbc1463d16-kube-api-access-gp8qc\") on node \"crc\" DevicePath \"\"" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.107407 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cff1d2b6-6f33-459c-bf20-48cbc1463d16-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.399371 4732 generic.go:334] "Generic (PLEG): container finished" podID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" 
containerID="565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd" exitCode=0 Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.399438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6765" event={"ID":"cff1d2b6-6f33-459c-bf20-48cbc1463d16","Type":"ContainerDied","Data":"565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd"} Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.399504 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n6765" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.399580 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n6765" event={"ID":"cff1d2b6-6f33-459c-bf20-48cbc1463d16","Type":"ContainerDied","Data":"44f4376913505bd9c0ec2df7422955d951c2a496da3029223e19b5c6e854a250"} Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.399612 4732 scope.go:117] "RemoveContainer" containerID="565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.432199 4732 scope.go:117] "RemoveContainer" containerID="72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.444120 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n6765"] Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.452735 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n6765"] Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.467959 4732 scope.go:117] "RemoveContainer" containerID="17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.499978 4732 scope.go:117] "RemoveContainer" containerID="565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd" Oct 10 
07:21:52 crc kubenswrapper[4732]: E1010 07:21:52.500514 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd\": container with ID starting with 565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd not found: ID does not exist" containerID="565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.500606 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd"} err="failed to get container status \"565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd\": rpc error: code = NotFound desc = could not find container \"565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd\": container with ID starting with 565c6d94236488edb3ce24e7904b5e5460292142060ee9242ce8585be7a99cdd not found: ID does not exist" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.500650 4732 scope.go:117] "RemoveContainer" containerID="72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e" Oct 10 07:21:52 crc kubenswrapper[4732]: E1010 07:21:52.501615 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e\": container with ID starting with 72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e not found: ID does not exist" containerID="72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.502312 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e"} err="failed to get container status 
\"72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e\": rpc error: code = NotFound desc = could not find container \"72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e\": container with ID starting with 72dd20487b0192cc593e5a237ec910df08f1c0a97284dfcee47329baf9b8ba1e not found: ID does not exist" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.502357 4732 scope.go:117] "RemoveContainer" containerID="17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0" Oct 10 07:21:52 crc kubenswrapper[4732]: E1010 07:21:52.503444 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0\": container with ID starting with 17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0 not found: ID does not exist" containerID="17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0" Oct 10 07:21:52 crc kubenswrapper[4732]: I1010 07:21:52.503487 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0"} err="failed to get container status \"17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0\": rpc error: code = NotFound desc = could not find container \"17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0\": container with ID starting with 17b68bb216028bf8ad67850b298c50aeb8a0232026ebdc5366996b6a3bdf34e0 not found: ID does not exist" Oct 10 07:21:53 crc kubenswrapper[4732]: I1010 07:21:53.697546 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" path="/var/lib/kubelet/pods/cff1d2b6-6f33-459c-bf20-48cbc1463d16/volumes" Oct 10 07:23:25 crc kubenswrapper[4732]: I1010 07:23:25.356219 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:23:25 crc kubenswrapper[4732]: I1010 07:23:25.356875 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:23:55 crc kubenswrapper[4732]: I1010 07:23:55.355557 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:23:55 crc kubenswrapper[4732]: I1010 07:23:55.356285 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.356418 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.357035 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.357117 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.358095 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47ff4480f036b42a35a03b5ca25dbd459d4294026c75ade3a4233c6e6dc96947"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.358207 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://47ff4480f036b42a35a03b5ca25dbd459d4294026c75ade3a4233c6e6dc96947" gracePeriod=600 Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.813883 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="47ff4480f036b42a35a03b5ca25dbd459d4294026c75ade3a4233c6e6dc96947" exitCode=0 Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.813953 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"47ff4480f036b42a35a03b5ca25dbd459d4294026c75ade3a4233c6e6dc96947"} Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.814061 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713"} Oct 10 07:24:25 crc kubenswrapper[4732]: I1010 07:24:25.814104 4732 scope.go:117] "RemoveContainer" containerID="c0afa98762e14da8ee91c2cff1cd2b29a8a217f54454b039cb71ccb28b7b29d3" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.777728 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jfdsp"] Oct 10 07:25:24 crc kubenswrapper[4732]: E1010 07:25:24.778504 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="extract-content" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.778517 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="extract-content" Oct 10 07:25:24 crc kubenswrapper[4732]: E1010 07:25:24.778547 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="registry-server" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.778554 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="registry-server" Oct 10 07:25:24 crc kubenswrapper[4732]: E1010 07:25:24.778565 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="extract-utilities" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.778570 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="extract-utilities" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.778751 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff1d2b6-6f33-459c-bf20-48cbc1463d16" containerName="registry-server" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.779678 4732 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.823638 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pl99\" (UniqueName: \"kubernetes.io/projected/66e063fb-a32c-4f21-8c19-8788208c2c3d-kube-api-access-9pl99\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.823752 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-catalog-content\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.823787 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-utilities\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.838442 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfdsp"] Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.930025 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-catalog-content\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.930472 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-utilities\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.930647 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pl99\" (UniqueName: \"kubernetes.io/projected/66e063fb-a32c-4f21-8c19-8788208c2c3d-kube-api-access-9pl99\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.930861 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-utilities\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.930652 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-catalog-content\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:24 crc kubenswrapper[4732]: I1010 07:25:24.961139 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pl99\" (UniqueName: \"kubernetes.io/projected/66e063fb-a32c-4f21-8c19-8788208c2c3d-kube-api-access-9pl99\") pod \"redhat-operators-jfdsp\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:25 crc kubenswrapper[4732]: I1010 07:25:25.106277 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:25 crc kubenswrapper[4732]: I1010 07:25:25.588548 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jfdsp"] Oct 10 07:25:26 crc kubenswrapper[4732]: I1010 07:25:26.364676 4732 generic.go:334] "Generic (PLEG): container finished" podID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerID="df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19" exitCode=0 Oct 10 07:25:26 crc kubenswrapper[4732]: I1010 07:25:26.364863 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfdsp" event={"ID":"66e063fb-a32c-4f21-8c19-8788208c2c3d","Type":"ContainerDied","Data":"df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19"} Oct 10 07:25:26 crc kubenswrapper[4732]: I1010 07:25:26.365111 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfdsp" event={"ID":"66e063fb-a32c-4f21-8c19-8788208c2c3d","Type":"ContainerStarted","Data":"51c7c6f778ee34f2d7e7579d0834173a028d4ad96d1e9c95a4505ba9038fcf26"} Oct 10 07:25:27 crc kubenswrapper[4732]: I1010 07:25:27.375315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfdsp" event={"ID":"66e063fb-a32c-4f21-8c19-8788208c2c3d","Type":"ContainerStarted","Data":"67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc"} Oct 10 07:25:28 crc kubenswrapper[4732]: I1010 07:25:28.384723 4732 generic.go:334] "Generic (PLEG): container finished" podID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerID="67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc" exitCode=0 Oct 10 07:25:28 crc kubenswrapper[4732]: I1010 07:25:28.384762 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfdsp" 
event={"ID":"66e063fb-a32c-4f21-8c19-8788208c2c3d","Type":"ContainerDied","Data":"67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc"} Oct 10 07:25:29 crc kubenswrapper[4732]: I1010 07:25:29.391684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfdsp" event={"ID":"66e063fb-a32c-4f21-8c19-8788208c2c3d","Type":"ContainerStarted","Data":"cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba"} Oct 10 07:25:29 crc kubenswrapper[4732]: I1010 07:25:29.413013 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jfdsp" podStartSLOduration=2.877971468 podStartE2EDuration="5.41299284s" podCreationTimestamp="2025-10-10 07:25:24 +0000 UTC" firstStartedPulling="2025-10-10 07:25:26.366044682 +0000 UTC m=+2053.435635923" lastFinishedPulling="2025-10-10 07:25:28.901066034 +0000 UTC m=+2055.970657295" observedRunningTime="2025-10-10 07:25:29.409819274 +0000 UTC m=+2056.479410525" watchObservedRunningTime="2025-10-10 07:25:29.41299284 +0000 UTC m=+2056.482584091" Oct 10 07:25:35 crc kubenswrapper[4732]: I1010 07:25:35.107909 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:35 crc kubenswrapper[4732]: I1010 07:25:35.108631 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:35 crc kubenswrapper[4732]: I1010 07:25:35.181665 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:35 crc kubenswrapper[4732]: I1010 07:25:35.515926 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:35 crc kubenswrapper[4732]: I1010 07:25:35.584611 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-jfdsp"] Oct 10 07:25:37 crc kubenswrapper[4732]: I1010 07:25:37.457381 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jfdsp" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="registry-server" containerID="cri-o://cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba" gracePeriod=2 Oct 10 07:25:37 crc kubenswrapper[4732]: I1010 07:25:37.907458 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.024284 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-utilities\") pod \"66e063fb-a32c-4f21-8c19-8788208c2c3d\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.024350 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-catalog-content\") pod \"66e063fb-a32c-4f21-8c19-8788208c2c3d\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.024407 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pl99\" (UniqueName: \"kubernetes.io/projected/66e063fb-a32c-4f21-8c19-8788208c2c3d-kube-api-access-9pl99\") pod \"66e063fb-a32c-4f21-8c19-8788208c2c3d\" (UID: \"66e063fb-a32c-4f21-8c19-8788208c2c3d\") " Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.026038 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-utilities" (OuterVolumeSpecName: "utilities") pod "66e063fb-a32c-4f21-8c19-8788208c2c3d" (UID: 
"66e063fb-a32c-4f21-8c19-8788208c2c3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.033924 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66e063fb-a32c-4f21-8c19-8788208c2c3d-kube-api-access-9pl99" (OuterVolumeSpecName: "kube-api-access-9pl99") pod "66e063fb-a32c-4f21-8c19-8788208c2c3d" (UID: "66e063fb-a32c-4f21-8c19-8788208c2c3d"). InnerVolumeSpecName "kube-api-access-9pl99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.126399 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.126428 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pl99\" (UniqueName: \"kubernetes.io/projected/66e063fb-a32c-4f21-8c19-8788208c2c3d-kube-api-access-9pl99\") on node \"crc\" DevicePath \"\"" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.470476 4732 generic.go:334] "Generic (PLEG): container finished" podID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerID="cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba" exitCode=0 Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.470544 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jfdsp" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.470538 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfdsp" event={"ID":"66e063fb-a32c-4f21-8c19-8788208c2c3d","Type":"ContainerDied","Data":"cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba"} Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.470615 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jfdsp" event={"ID":"66e063fb-a32c-4f21-8c19-8788208c2c3d","Type":"ContainerDied","Data":"51c7c6f778ee34f2d7e7579d0834173a028d4ad96d1e9c95a4505ba9038fcf26"} Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.470651 4732 scope.go:117] "RemoveContainer" containerID="cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.491745 4732 scope.go:117] "RemoveContainer" containerID="67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.516340 4732 scope.go:117] "RemoveContainer" containerID="df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.553335 4732 scope.go:117] "RemoveContainer" containerID="cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba" Oct 10 07:25:38 crc kubenswrapper[4732]: E1010 07:25:38.553853 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba\": container with ID starting with cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba not found: ID does not exist" containerID="cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.553904 4732 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba"} err="failed to get container status \"cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba\": rpc error: code = NotFound desc = could not find container \"cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba\": container with ID starting with cab45d5042f034ab73f34b1c569d9e323b6107ee9232d3dc52bcc2fbc472a1ba not found: ID does not exist" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.553940 4732 scope.go:117] "RemoveContainer" containerID="67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc" Oct 10 07:25:38 crc kubenswrapper[4732]: E1010 07:25:38.554379 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc\": container with ID starting with 67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc not found: ID does not exist" containerID="67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.554430 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc"} err="failed to get container status \"67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc\": rpc error: code = NotFound desc = could not find container \"67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc\": container with ID starting with 67d65b1e316cd8edda8fe555448651a512c77badcb5db3566da5073ce60f49cc not found: ID does not exist" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.554464 4732 scope.go:117] "RemoveContainer" containerID="df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19" Oct 10 07:25:38 crc kubenswrapper[4732]: E1010 07:25:38.554879 4732 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19\": container with ID starting with df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19 not found: ID does not exist" containerID="df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.554914 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19"} err="failed to get container status \"df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19\": rpc error: code = NotFound desc = could not find container \"df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19\": container with ID starting with df17fa48aeb4c159009f19e01907e274fc60f1c5bbd59e562546435b5007bc19 not found: ID does not exist" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.910038 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66e063fb-a32c-4f21-8c19-8788208c2c3d" (UID: "66e063fb-a32c-4f21-8c19-8788208c2c3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:25:38 crc kubenswrapper[4732]: I1010 07:25:38.939155 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66e063fb-a32c-4f21-8c19-8788208c2c3d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:25:39 crc kubenswrapper[4732]: I1010 07:25:39.117314 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jfdsp"] Oct 10 07:25:39 crc kubenswrapper[4732]: I1010 07:25:39.123750 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jfdsp"] Oct 10 07:25:39 crc kubenswrapper[4732]: I1010 07:25:39.675204 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" path="/var/lib/kubelet/pods/66e063fb-a32c-4f21-8c19-8788208c2c3d/volumes" Oct 10 07:26:25 crc kubenswrapper[4732]: I1010 07:26:25.355641 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:26:25 crc kubenswrapper[4732]: I1010 07:26:25.356256 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.321148 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4xf4s"] Oct 10 07:26:40 crc kubenswrapper[4732]: E1010 07:26:40.321978 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="extract-utilities" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.321994 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="extract-utilities" Oct 10 07:26:40 crc kubenswrapper[4732]: E1010 07:26:40.322012 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="extract-content" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.322021 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="extract-content" Oct 10 07:26:40 crc kubenswrapper[4732]: E1010 07:26:40.322042 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="registry-server" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.322050 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="registry-server" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.322266 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="66e063fb-a32c-4f21-8c19-8788208c2c3d" containerName="registry-server" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.323515 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.340228 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xf4s"] Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.431970 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-catalog-content\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.432331 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6jfs\" (UniqueName: \"kubernetes.io/projected/146da698-2eb1-4356-96f5-de475ebd967c-kube-api-access-c6jfs\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.432533 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-utilities\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.534151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-utilities\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.534459 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-catalog-content\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.534569 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6jfs\" (UniqueName: \"kubernetes.io/projected/146da698-2eb1-4356-96f5-de475ebd967c-kube-api-access-c6jfs\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.534762 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-utilities\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.535046 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-catalog-content\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.555241 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6jfs\" (UniqueName: \"kubernetes.io/projected/146da698-2eb1-4356-96f5-de475ebd967c-kube-api-access-c6jfs\") pod \"redhat-marketplace-4xf4s\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:40 crc kubenswrapper[4732]: I1010 07:26:40.659787 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:41 crc kubenswrapper[4732]: I1010 07:26:41.101409 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xf4s"] Oct 10 07:26:42 crc kubenswrapper[4732]: I1010 07:26:42.093608 4732 generic.go:334] "Generic (PLEG): container finished" podID="146da698-2eb1-4356-96f5-de475ebd967c" containerID="23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c" exitCode=0 Oct 10 07:26:42 crc kubenswrapper[4732]: I1010 07:26:42.093684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xf4s" event={"ID":"146da698-2eb1-4356-96f5-de475ebd967c","Type":"ContainerDied","Data":"23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c"} Oct 10 07:26:42 crc kubenswrapper[4732]: I1010 07:26:42.094182 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xf4s" event={"ID":"146da698-2eb1-4356-96f5-de475ebd967c","Type":"ContainerStarted","Data":"68e1615a8fde77331ec1601b19cdc685d16cece8998179259c333e6bfda6cd6d"} Oct 10 07:26:42 crc kubenswrapper[4732]: I1010 07:26:42.096196 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:26:43 crc kubenswrapper[4732]: I1010 07:26:43.104214 4732 generic.go:334] "Generic (PLEG): container finished" podID="146da698-2eb1-4356-96f5-de475ebd967c" containerID="e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731" exitCode=0 Oct 10 07:26:43 crc kubenswrapper[4732]: I1010 07:26:43.104263 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xf4s" event={"ID":"146da698-2eb1-4356-96f5-de475ebd967c","Type":"ContainerDied","Data":"e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731"} Oct 10 07:26:44 crc kubenswrapper[4732]: I1010 07:26:44.116077 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-4xf4s" event={"ID":"146da698-2eb1-4356-96f5-de475ebd967c","Type":"ContainerStarted","Data":"8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b"} Oct 10 07:26:44 crc kubenswrapper[4732]: I1010 07:26:44.149920 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4xf4s" podStartSLOduration=2.737292223 podStartE2EDuration="4.149888098s" podCreationTimestamp="2025-10-10 07:26:40 +0000 UTC" firstStartedPulling="2025-10-10 07:26:42.095960921 +0000 UTC m=+2129.165552162" lastFinishedPulling="2025-10-10 07:26:43.508556756 +0000 UTC m=+2130.578148037" observedRunningTime="2025-10-10 07:26:44.140953224 +0000 UTC m=+2131.210544515" watchObservedRunningTime="2025-10-10 07:26:44.149888098 +0000 UTC m=+2131.219479379" Oct 10 07:26:50 crc kubenswrapper[4732]: I1010 07:26:50.659959 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:50 crc kubenswrapper[4732]: I1010 07:26:50.660964 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:50 crc kubenswrapper[4732]: I1010 07:26:50.722675 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:51 crc kubenswrapper[4732]: I1010 07:26:51.225536 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:51 crc kubenswrapper[4732]: I1010 07:26:51.288514 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xf4s"] Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.197886 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4xf4s" 
podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="registry-server" containerID="cri-o://8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b" gracePeriod=2 Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.373281 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vmtv5"] Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.375137 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.390438 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmtv5"] Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.522477 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8wr\" (UniqueName: \"kubernetes.io/projected/a19d643e-adc3-4433-8b6c-09eb9efb753d-kube-api-access-fs8wr\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.522568 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-utilities\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.522606 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-catalog-content\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc 
kubenswrapper[4732]: I1010 07:26:53.626123 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8wr\" (UniqueName: \"kubernetes.io/projected/a19d643e-adc3-4433-8b6c-09eb9efb753d-kube-api-access-fs8wr\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.626201 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-utilities\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.626236 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-catalog-content\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.626806 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-catalog-content\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.626908 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-utilities\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.649279 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8wr\" (UniqueName: \"kubernetes.io/projected/a19d643e-adc3-4433-8b6c-09eb9efb753d-kube-api-access-fs8wr\") pod \"certified-operators-vmtv5\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.694803 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.697844 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.727195 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-catalog-content\") pod \"146da698-2eb1-4356-96f5-de475ebd967c\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.727315 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-utilities\") pod \"146da698-2eb1-4356-96f5-de475ebd967c\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.727349 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6jfs\" (UniqueName: \"kubernetes.io/projected/146da698-2eb1-4356-96f5-de475ebd967c-kube-api-access-c6jfs\") pod \"146da698-2eb1-4356-96f5-de475ebd967c\" (UID: \"146da698-2eb1-4356-96f5-de475ebd967c\") " Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.729152 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-utilities" (OuterVolumeSpecName: "utilities") pod "146da698-2eb1-4356-96f5-de475ebd967c" (UID: "146da698-2eb1-4356-96f5-de475ebd967c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.735918 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146da698-2eb1-4356-96f5-de475ebd967c-kube-api-access-c6jfs" (OuterVolumeSpecName: "kube-api-access-c6jfs") pod "146da698-2eb1-4356-96f5-de475ebd967c" (UID: "146da698-2eb1-4356-96f5-de475ebd967c"). InnerVolumeSpecName "kube-api-access-c6jfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.749766 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "146da698-2eb1-4356-96f5-de475ebd967c" (UID: "146da698-2eb1-4356-96f5-de475ebd967c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.828755 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.828794 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6jfs\" (UniqueName: \"kubernetes.io/projected/146da698-2eb1-4356-96f5-de475ebd967c-kube-api-access-c6jfs\") on node \"crc\" DevicePath \"\"" Oct 10 07:26:53 crc kubenswrapper[4732]: I1010 07:26:53.828817 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/146da698-2eb1-4356-96f5-de475ebd967c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.207267 4732 generic.go:334] "Generic (PLEG): container finished" podID="146da698-2eb1-4356-96f5-de475ebd967c" containerID="8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b" exitCode=0 Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.207359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xf4s" event={"ID":"146da698-2eb1-4356-96f5-de475ebd967c","Type":"ContainerDied","Data":"8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b"} Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.207668 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xf4s" event={"ID":"146da698-2eb1-4356-96f5-de475ebd967c","Type":"ContainerDied","Data":"68e1615a8fde77331ec1601b19cdc685d16cece8998179259c333e6bfda6cd6d"} Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.207714 4732 scope.go:117] "RemoveContainer" containerID="8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 
07:26:54.207402 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xf4s" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.218787 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmtv5"] Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.235845 4732 scope.go:117] "RemoveContainer" containerID="e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.256588 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xf4s"] Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.266491 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xf4s"] Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.279442 4732 scope.go:117] "RemoveContainer" containerID="23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.304437 4732 scope.go:117] "RemoveContainer" containerID="8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b" Oct 10 07:26:54 crc kubenswrapper[4732]: E1010 07:26:54.305196 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b\": container with ID starting with 8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b not found: ID does not exist" containerID="8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.305234 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b"} err="failed to get container status 
\"8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b\": rpc error: code = NotFound desc = could not find container \"8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b\": container with ID starting with 8cd3bd4f96f76a46c4266b1d14107bc3231d7e5bc1afb5d830c2cdb3214d367b not found: ID does not exist" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.305259 4732 scope.go:117] "RemoveContainer" containerID="e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731" Oct 10 07:26:54 crc kubenswrapper[4732]: E1010 07:26:54.305640 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731\": container with ID starting with e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731 not found: ID does not exist" containerID="e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.306206 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731"} err="failed to get container status \"e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731\": rpc error: code = NotFound desc = could not find container \"e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731\": container with ID starting with e32a7539b59231737c8da36e4ce3b5ea2c6393bd24335ad3e1aa8f744aa50731 not found: ID does not exist" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.306235 4732 scope.go:117] "RemoveContainer" containerID="23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c" Oct 10 07:26:54 crc kubenswrapper[4732]: E1010 07:26:54.306613 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c\": container with ID starting with 23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c not found: ID does not exist" containerID="23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c" Oct 10 07:26:54 crc kubenswrapper[4732]: I1010 07:26:54.306633 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c"} err="failed to get container status \"23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c\": rpc error: code = NotFound desc = could not find container \"23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c\": container with ID starting with 23de47c73ac7af5da280532aaffa050288cb1c72e0657aa2b56f99c54d46bc0c not found: ID does not exist" Oct 10 07:26:55 crc kubenswrapper[4732]: I1010 07:26:55.225277 4732 generic.go:334] "Generic (PLEG): container finished" podID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerID="a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b" exitCode=0 Oct 10 07:26:55 crc kubenswrapper[4732]: I1010 07:26:55.225616 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmtv5" event={"ID":"a19d643e-adc3-4433-8b6c-09eb9efb753d","Type":"ContainerDied","Data":"a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b"} Oct 10 07:26:55 crc kubenswrapper[4732]: I1010 07:26:55.225929 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmtv5" event={"ID":"a19d643e-adc3-4433-8b6c-09eb9efb753d","Type":"ContainerStarted","Data":"d751aa79bed7aba52a0acad6bae79d7cf949617b716a6376fc39e9edd0a99c17"} Oct 10 07:26:55 crc kubenswrapper[4732]: I1010 07:26:55.356795 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:26:55 crc kubenswrapper[4732]: I1010 07:26:55.356906 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:26:55 crc kubenswrapper[4732]: I1010 07:26:55.677212 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146da698-2eb1-4356-96f5-de475ebd967c" path="/var/lib/kubelet/pods/146da698-2eb1-4356-96f5-de475ebd967c/volumes" Oct 10 07:26:57 crc kubenswrapper[4732]: I1010 07:26:57.251447 4732 generic.go:334] "Generic (PLEG): container finished" podID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerID="cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163" exitCode=0 Oct 10 07:26:57 crc kubenswrapper[4732]: I1010 07:26:57.251567 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmtv5" event={"ID":"a19d643e-adc3-4433-8b6c-09eb9efb753d","Type":"ContainerDied","Data":"cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163"} Oct 10 07:26:58 crc kubenswrapper[4732]: I1010 07:26:58.268353 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmtv5" event={"ID":"a19d643e-adc3-4433-8b6c-09eb9efb753d","Type":"ContainerStarted","Data":"2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59"} Oct 10 07:26:58 crc kubenswrapper[4732]: I1010 07:26:58.294188 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vmtv5" podStartSLOduration=2.688266911 podStartE2EDuration="5.294165365s" podCreationTimestamp="2025-10-10 07:26:53 +0000 UTC" 
firstStartedPulling="2025-10-10 07:26:55.232437909 +0000 UTC m=+2142.302029160" lastFinishedPulling="2025-10-10 07:26:57.838336363 +0000 UTC m=+2144.907927614" observedRunningTime="2025-10-10 07:26:58.291547024 +0000 UTC m=+2145.361138345" watchObservedRunningTime="2025-10-10 07:26:58.294165365 +0000 UTC m=+2145.363756616" Oct 10 07:27:03 crc kubenswrapper[4732]: I1010 07:27:03.695353 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:27:03 crc kubenswrapper[4732]: I1010 07:27:03.695932 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:27:03 crc kubenswrapper[4732]: I1010 07:27:03.753388 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:27:04 crc kubenswrapper[4732]: I1010 07:27:04.390098 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:27:04 crc kubenswrapper[4732]: I1010 07:27:04.448153 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmtv5"] Oct 10 07:27:06 crc kubenswrapper[4732]: I1010 07:27:06.361011 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vmtv5" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="registry-server" containerID="cri-o://2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59" gracePeriod=2 Oct 10 07:27:06 crc kubenswrapper[4732]: I1010 07:27:06.886239 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.025464 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-utilities\") pod \"a19d643e-adc3-4433-8b6c-09eb9efb753d\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.025544 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-catalog-content\") pod \"a19d643e-adc3-4433-8b6c-09eb9efb753d\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.025633 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8wr\" (UniqueName: \"kubernetes.io/projected/a19d643e-adc3-4433-8b6c-09eb9efb753d-kube-api-access-fs8wr\") pod \"a19d643e-adc3-4433-8b6c-09eb9efb753d\" (UID: \"a19d643e-adc3-4433-8b6c-09eb9efb753d\") " Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.027193 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-utilities" (OuterVolumeSpecName: "utilities") pod "a19d643e-adc3-4433-8b6c-09eb9efb753d" (UID: "a19d643e-adc3-4433-8b6c-09eb9efb753d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.035940 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19d643e-adc3-4433-8b6c-09eb9efb753d-kube-api-access-fs8wr" (OuterVolumeSpecName: "kube-api-access-fs8wr") pod "a19d643e-adc3-4433-8b6c-09eb9efb753d" (UID: "a19d643e-adc3-4433-8b6c-09eb9efb753d"). InnerVolumeSpecName "kube-api-access-fs8wr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.128004 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs8wr\" (UniqueName: \"kubernetes.io/projected/a19d643e-adc3-4433-8b6c-09eb9efb753d-kube-api-access-fs8wr\") on node \"crc\" DevicePath \"\"" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.128055 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.192086 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a19d643e-adc3-4433-8b6c-09eb9efb753d" (UID: "a19d643e-adc3-4433-8b6c-09eb9efb753d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.229893 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a19d643e-adc3-4433-8b6c-09eb9efb753d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.385520 4732 generic.go:334] "Generic (PLEG): container finished" podID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerID="2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59" exitCode=0 Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.385625 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmtv5" event={"ID":"a19d643e-adc3-4433-8b6c-09eb9efb753d","Type":"ContainerDied","Data":"2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59"} Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.385687 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-vmtv5" event={"ID":"a19d643e-adc3-4433-8b6c-09eb9efb753d","Type":"ContainerDied","Data":"d751aa79bed7aba52a0acad6bae79d7cf949617b716a6376fc39e9edd0a99c17"} Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.385792 4732 scope.go:117] "RemoveContainer" containerID="2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.386158 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmtv5" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.419401 4732 scope.go:117] "RemoveContainer" containerID="cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.444963 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmtv5"] Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.453832 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vmtv5"] Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.458113 4732 scope.go:117] "RemoveContainer" containerID="a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.495065 4732 scope.go:117] "RemoveContainer" containerID="2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59" Oct 10 07:27:07 crc kubenswrapper[4732]: E1010 07:27:07.495645 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59\": container with ID starting with 2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59 not found: ID does not exist" containerID="2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 
07:27:07.495704 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59"} err="failed to get container status \"2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59\": rpc error: code = NotFound desc = could not find container \"2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59\": container with ID starting with 2d22bda2b5581ec9e591fcc1e8a65b66c8ce009d9c52a6530b04baca9f0dcd59 not found: ID does not exist" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.495747 4732 scope.go:117] "RemoveContainer" containerID="cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163" Oct 10 07:27:07 crc kubenswrapper[4732]: E1010 07:27:07.496196 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163\": container with ID starting with cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163 not found: ID does not exist" containerID="cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.496228 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163"} err="failed to get container status \"cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163\": rpc error: code = NotFound desc = could not find container \"cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163\": container with ID starting with cd1e8ca2f5d388242d6ac49852c1c1f8945bef55ab7e7321bac69de86577f163 not found: ID does not exist" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.496246 4732 scope.go:117] "RemoveContainer" containerID="a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b" Oct 10 07:27:07 crc 
kubenswrapper[4732]: E1010 07:27:07.496590 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b\": container with ID starting with a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b not found: ID does not exist" containerID="a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.496653 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b"} err="failed to get container status \"a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b\": rpc error: code = NotFound desc = could not find container \"a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b\": container with ID starting with a3f24f0b6513aee2248981df51a16e0cdad63e260119c10945b107b3cbd8d36b not found: ID does not exist" Oct 10 07:27:07 crc kubenswrapper[4732]: I1010 07:27:07.671164 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" path="/var/lib/kubelet/pods/a19d643e-adc3-4433-8b6c-09eb9efb753d/volumes" Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.356593 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.358005 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.358096 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.359077 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.359201 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" gracePeriod=600 Oct 10 07:27:25 crc kubenswrapper[4732]: E1010 07:27:25.483984 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.608050 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" exitCode=0 Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.608132 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713"} Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.608199 4732 scope.go:117] "RemoveContainer" containerID="47ff4480f036b42a35a03b5ca25dbd459d4294026c75ade3a4233c6e6dc96947" Oct 10 07:27:25 crc kubenswrapper[4732]: I1010 07:27:25.609068 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:27:25 crc kubenswrapper[4732]: E1010 07:27:25.609528 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:27:39 crc kubenswrapper[4732]: I1010 07:27:39.663635 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:27:39 crc kubenswrapper[4732]: E1010 07:27:39.664658 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:27:54 crc kubenswrapper[4732]: I1010 07:27:54.660553 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:27:54 crc kubenswrapper[4732]: E1010 07:27:54.661535 4732 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:28:09 crc kubenswrapper[4732]: I1010 07:28:09.661507 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:28:09 crc kubenswrapper[4732]: E1010 07:28:09.662761 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:28:21 crc kubenswrapper[4732]: I1010 07:28:21.660620 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:28:21 crc kubenswrapper[4732]: E1010 07:28:21.661610 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:28:33 crc kubenswrapper[4732]: I1010 07:28:33.668623 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:28:33 crc kubenswrapper[4732]: E1010 
07:28:33.669850 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:28:45 crc kubenswrapper[4732]: I1010 07:28:45.661661 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:28:45 crc kubenswrapper[4732]: E1010 07:28:45.662529 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:28:59 crc kubenswrapper[4732]: I1010 07:28:59.665597 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:28:59 crc kubenswrapper[4732]: E1010 07:28:59.666813 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:29:11 crc kubenswrapper[4732]: I1010 07:29:11.660214 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:29:11 crc 
kubenswrapper[4732]: E1010 07:29:11.661181 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:29:26 crc kubenswrapper[4732]: I1010 07:29:26.660146 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:29:26 crc kubenswrapper[4732]: E1010 07:29:26.661169 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:29:39 crc kubenswrapper[4732]: I1010 07:29:39.660609 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:29:39 crc kubenswrapper[4732]: E1010 07:29:39.661589 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:29:52 crc kubenswrapper[4732]: I1010 07:29:52.659707 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 
10 07:29:52 crc kubenswrapper[4732]: E1010 07:29:52.661417 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.165303 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6"] Oct 10 07:30:00 crc kubenswrapper[4732]: E1010 07:30:00.166354 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166376 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4732]: E1010 07:30:00.166401 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="extract-utilities" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166414 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="extract-utilities" Oct 10 07:30:00 crc kubenswrapper[4732]: E1010 07:30:00.166438 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="extract-content" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166447 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="extract-content" Oct 10 07:30:00 crc kubenswrapper[4732]: E1010 07:30:00.166472 4732 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="extract-content" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166480 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="extract-content" Oct 10 07:30:00 crc kubenswrapper[4732]: E1010 07:30:00.166495 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166503 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4732]: E1010 07:30:00.166520 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="extract-utilities" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166528 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="extract-utilities" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166748 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="146da698-2eb1-4356-96f5-de475ebd967c" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.166765 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19d643e-adc3-4433-8b6c-09eb9efb753d" containerName="registry-server" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.167572 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.170036 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.170261 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.191211 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6"] Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.295972 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2965b0ef-afde-4345-8b09-d1e84cbdf542-secret-volume\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.296131 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd7fw\" (UniqueName: \"kubernetes.io/projected/2965b0ef-afde-4345-8b09-d1e84cbdf542-kube-api-access-jd7fw\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.296331 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2965b0ef-afde-4345-8b09-d1e84cbdf542-config-volume\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.397806 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd7fw\" (UniqueName: \"kubernetes.io/projected/2965b0ef-afde-4345-8b09-d1e84cbdf542-kube-api-access-jd7fw\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.397899 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2965b0ef-afde-4345-8b09-d1e84cbdf542-config-volume\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.397933 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2965b0ef-afde-4345-8b09-d1e84cbdf542-secret-volume\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.399102 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2965b0ef-afde-4345-8b09-d1e84cbdf542-config-volume\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.406141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2965b0ef-afde-4345-8b09-d1e84cbdf542-secret-volume\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.426619 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd7fw\" (UniqueName: \"kubernetes.io/projected/2965b0ef-afde-4345-8b09-d1e84cbdf542-kube-api-access-jd7fw\") pod \"collect-profiles-29334690-zzsp6\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.501943 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:00 crc kubenswrapper[4732]: I1010 07:30:00.942418 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6"] Oct 10 07:30:01 crc kubenswrapper[4732]: I1010 07:30:01.027493 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" event={"ID":"2965b0ef-afde-4345-8b09-d1e84cbdf542","Type":"ContainerStarted","Data":"0b4c0c3941fba6d2eceb6515f52024332e93664ed2807aa5d1d610f3e7d075e3"} Oct 10 07:30:02 crc kubenswrapper[4732]: I1010 07:30:02.039120 4732 generic.go:334] "Generic (PLEG): container finished" podID="2965b0ef-afde-4345-8b09-d1e84cbdf542" containerID="41b83a38c9c7e348505e3e383607118be28f16cc37539af71c899957b698a648" exitCode=0 Oct 10 07:30:02 crc kubenswrapper[4732]: I1010 07:30:02.039204 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" 
event={"ID":"2965b0ef-afde-4345-8b09-d1e84cbdf542","Type":"ContainerDied","Data":"41b83a38c9c7e348505e3e383607118be28f16cc37539af71c899957b698a648"} Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.327618 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.439038 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd7fw\" (UniqueName: \"kubernetes.io/projected/2965b0ef-afde-4345-8b09-d1e84cbdf542-kube-api-access-jd7fw\") pod \"2965b0ef-afde-4345-8b09-d1e84cbdf542\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.439174 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2965b0ef-afde-4345-8b09-d1e84cbdf542-secret-volume\") pod \"2965b0ef-afde-4345-8b09-d1e84cbdf542\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.439208 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2965b0ef-afde-4345-8b09-d1e84cbdf542-config-volume\") pod \"2965b0ef-afde-4345-8b09-d1e84cbdf542\" (UID: \"2965b0ef-afde-4345-8b09-d1e84cbdf542\") " Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.439842 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2965b0ef-afde-4345-8b09-d1e84cbdf542-config-volume" (OuterVolumeSpecName: "config-volume") pod "2965b0ef-afde-4345-8b09-d1e84cbdf542" (UID: "2965b0ef-afde-4345-8b09-d1e84cbdf542"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.440041 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2965b0ef-afde-4345-8b09-d1e84cbdf542-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.444062 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2965b0ef-afde-4345-8b09-d1e84cbdf542-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2965b0ef-afde-4345-8b09-d1e84cbdf542" (UID: "2965b0ef-afde-4345-8b09-d1e84cbdf542"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.445551 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2965b0ef-afde-4345-8b09-d1e84cbdf542-kube-api-access-jd7fw" (OuterVolumeSpecName: "kube-api-access-jd7fw") pod "2965b0ef-afde-4345-8b09-d1e84cbdf542" (UID: "2965b0ef-afde-4345-8b09-d1e84cbdf542"). InnerVolumeSpecName "kube-api-access-jd7fw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.541421 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd7fw\" (UniqueName: \"kubernetes.io/projected/2965b0ef-afde-4345-8b09-d1e84cbdf542-kube-api-access-jd7fw\") on node \"crc\" DevicePath \"\"" Oct 10 07:30:03 crc kubenswrapper[4732]: I1010 07:30:03.541461 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2965b0ef-afde-4345-8b09-d1e84cbdf542-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:30:04 crc kubenswrapper[4732]: I1010 07:30:04.058828 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" event={"ID":"2965b0ef-afde-4345-8b09-d1e84cbdf542","Type":"ContainerDied","Data":"0b4c0c3941fba6d2eceb6515f52024332e93664ed2807aa5d1d610f3e7d075e3"} Oct 10 07:30:04 crc kubenswrapper[4732]: I1010 07:30:04.058878 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b4c0c3941fba6d2eceb6515f52024332e93664ed2807aa5d1d610f3e7d075e3" Oct 10 07:30:04 crc kubenswrapper[4732]: I1010 07:30:04.058887 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6" Oct 10 07:30:04 crc kubenswrapper[4732]: I1010 07:30:04.399491 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5"] Oct 10 07:30:04 crc kubenswrapper[4732]: I1010 07:30:04.405066 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334645-j4gb5"] Oct 10 07:30:05 crc kubenswrapper[4732]: I1010 07:30:05.670675 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e4bae7-5083-477d-ac35-4ab579a104ba" path="/var/lib/kubelet/pods/76e4bae7-5083-477d-ac35-4ab579a104ba/volumes" Oct 10 07:30:06 crc kubenswrapper[4732]: I1010 07:30:06.660748 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:30:06 crc kubenswrapper[4732]: E1010 07:30:06.661148 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:30:19 crc kubenswrapper[4732]: I1010 07:30:19.660872 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:30:19 crc kubenswrapper[4732]: E1010 07:30:19.662575 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:30:23 crc kubenswrapper[4732]: I1010 07:30:23.349592 4732 scope.go:117] "RemoveContainer" containerID="c20383127e3250c0fc7ad6bb49927db5918e7898b10684cba3946b15ea14a473" Oct 10 07:30:30 crc kubenswrapper[4732]: I1010 07:30:30.661302 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:30:30 crc kubenswrapper[4732]: E1010 07:30:30.662358 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:30:43 crc kubenswrapper[4732]: I1010 07:30:43.668613 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:30:43 crc kubenswrapper[4732]: E1010 07:30:43.669587 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:30:56 crc kubenswrapper[4732]: I1010 07:30:56.661507 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:30:56 crc kubenswrapper[4732]: E1010 07:30:56.662632 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:31:09 crc kubenswrapper[4732]: I1010 07:31:09.661258 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:31:09 crc kubenswrapper[4732]: E1010 07:31:09.664311 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:31:24 crc kubenswrapper[4732]: I1010 07:31:24.660420 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:31:24 crc kubenswrapper[4732]: E1010 07:31:24.663176 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:31:36 crc kubenswrapper[4732]: I1010 07:31:36.660772 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:31:36 crc kubenswrapper[4732]: E1010 07:31:36.661928 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:31:51 crc kubenswrapper[4732]: I1010 07:31:51.661028 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:31:51 crc kubenswrapper[4732]: E1010 07:31:51.662072 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:31:52 crc kubenswrapper[4732]: I1010 07:31:52.864157 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-krqg8"] Oct 10 07:31:52 crc kubenswrapper[4732]: E1010 07:31:52.866204 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2965b0ef-afde-4345-8b09-d1e84cbdf542" containerName="collect-profiles" Oct 10 07:31:52 crc kubenswrapper[4732]: I1010 07:31:52.866252 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2965b0ef-afde-4345-8b09-d1e84cbdf542" containerName="collect-profiles" Oct 10 07:31:52 crc kubenswrapper[4732]: I1010 07:31:52.866605 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2965b0ef-afde-4345-8b09-d1e84cbdf542" containerName="collect-profiles" Oct 10 07:31:52 crc kubenswrapper[4732]: I1010 07:31:52.869235 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:52 crc kubenswrapper[4732]: I1010 07:31:52.876022 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krqg8"] Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.000215 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjkwz\" (UniqueName: \"kubernetes.io/projected/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-kube-api-access-bjkwz\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.000345 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-utilities\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.000409 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-catalog-content\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.101829 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-catalog-content\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.101958 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bjkwz\" (UniqueName: \"kubernetes.io/projected/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-kube-api-access-bjkwz\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.102004 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-utilities\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.102467 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-catalog-content\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.103108 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-utilities\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.122191 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjkwz\" (UniqueName: \"kubernetes.io/projected/01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e-kube-api-access-bjkwz\") pod \"community-operators-krqg8\" (UID: \"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e\") " pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.195203 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:31:53 crc kubenswrapper[4732]: I1010 07:31:53.729897 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krqg8"] Oct 10 07:31:54 crc kubenswrapper[4732]: I1010 07:31:54.052829 4732 generic.go:334] "Generic (PLEG): container finished" podID="01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e" containerID="2752f53b869ddfd1e618ff2461859080a4f951b0a4ffda309b8982bf19424980" exitCode=0 Oct 10 07:31:54 crc kubenswrapper[4732]: I1010 07:31:54.052886 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqg8" event={"ID":"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e","Type":"ContainerDied","Data":"2752f53b869ddfd1e618ff2461859080a4f951b0a4ffda309b8982bf19424980"} Oct 10 07:31:54 crc kubenswrapper[4732]: I1010 07:31:54.052920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqg8" event={"ID":"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e","Type":"ContainerStarted","Data":"4da771260b32198833397e7ea67b1259bb9f52e1bafe857393e77d764654a0e6"} Oct 10 07:31:54 crc kubenswrapper[4732]: I1010 07:31:54.056489 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:31:58 crc kubenswrapper[4732]: I1010 07:31:58.135446 4732 generic.go:334] "Generic (PLEG): container finished" podID="01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e" containerID="3202ec432e16785286e7e9f41b5332bfb67906ef1f0673450ce9cebfb2d60307" exitCode=0 Oct 10 07:31:58 crc kubenswrapper[4732]: I1010 07:31:58.135544 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-krqg8" event={"ID":"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e","Type":"ContainerDied","Data":"3202ec432e16785286e7e9f41b5332bfb67906ef1f0673450ce9cebfb2d60307"} Oct 10 07:31:59 crc kubenswrapper[4732]: I1010 07:31:59.145237 4732 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-krqg8" event={"ID":"01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e","Type":"ContainerStarted","Data":"73ffda42f3a914904ffeb316f0d89062de5d5ae32fd5102c1376bc6acba4533d"} Oct 10 07:31:59 crc kubenswrapper[4732]: I1010 07:31:59.161895 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-krqg8" podStartSLOduration=2.687509907 podStartE2EDuration="7.161874645s" podCreationTimestamp="2025-10-10 07:31:52 +0000 UTC" firstStartedPulling="2025-10-10 07:31:54.056141623 +0000 UTC m=+2441.125732874" lastFinishedPulling="2025-10-10 07:31:58.530506331 +0000 UTC m=+2445.600097612" observedRunningTime="2025-10-10 07:31:59.159816779 +0000 UTC m=+2446.229408040" watchObservedRunningTime="2025-10-10 07:31:59.161874645 +0000 UTC m=+2446.231465896" Oct 10 07:32:03 crc kubenswrapper[4732]: I1010 07:32:03.196343 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:32:03 crc kubenswrapper[4732]: I1010 07:32:03.198782 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:32:03 crc kubenswrapper[4732]: I1010 07:32:03.283515 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.230594 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-krqg8" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.287260 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-krqg8"] Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.325753 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxg85"] Oct 10 07:32:04 crc 
kubenswrapper[4732]: I1010 07:32:04.326032 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxg85" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="registry-server" containerID="cri-o://d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136" gracePeriod=2 Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.710597 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxg85" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.778078 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86kxw\" (UniqueName: \"kubernetes.io/projected/560dd02b-9de9-4489-b95c-b039bcd21e3e-kube-api-access-86kxw\") pod \"560dd02b-9de9-4489-b95c-b039bcd21e3e\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.778231 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-catalog-content\") pod \"560dd02b-9de9-4489-b95c-b039bcd21e3e\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.778259 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-utilities\") pod \"560dd02b-9de9-4489-b95c-b039bcd21e3e\" (UID: \"560dd02b-9de9-4489-b95c-b039bcd21e3e\") " Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.779946 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-utilities" (OuterVolumeSpecName: "utilities") pod "560dd02b-9de9-4489-b95c-b039bcd21e3e" (UID: "560dd02b-9de9-4489-b95c-b039bcd21e3e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.801993 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560dd02b-9de9-4489-b95c-b039bcd21e3e-kube-api-access-86kxw" (OuterVolumeSpecName: "kube-api-access-86kxw") pod "560dd02b-9de9-4489-b95c-b039bcd21e3e" (UID: "560dd02b-9de9-4489-b95c-b039bcd21e3e"). InnerVolumeSpecName "kube-api-access-86kxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.836544 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "560dd02b-9de9-4489-b95c-b039bcd21e3e" (UID: "560dd02b-9de9-4489-b95c-b039bcd21e3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.880290 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86kxw\" (UniqueName: \"kubernetes.io/projected/560dd02b-9de9-4489-b95c-b039bcd21e3e-kube-api-access-86kxw\") on node \"crc\" DevicePath \"\"" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.880322 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:32:04 crc kubenswrapper[4732]: I1010 07:32:04.880332 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560dd02b-9de9-4489-b95c-b039bcd21e3e-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.193713 4732 generic.go:334] "Generic (PLEG): container finished" podID="560dd02b-9de9-4489-b95c-b039bcd21e3e" 
containerID="d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136" exitCode=0 Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.193751 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxg85" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.193794 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxg85" event={"ID":"560dd02b-9de9-4489-b95c-b039bcd21e3e","Type":"ContainerDied","Data":"d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136"} Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.193854 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxg85" event={"ID":"560dd02b-9de9-4489-b95c-b039bcd21e3e","Type":"ContainerDied","Data":"ced73c561c8593237786ff8c57f84b3557e7f201e14b182defe384481979a994"} Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.193878 4732 scope.go:117] "RemoveContainer" containerID="d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.228350 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxg85"] Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.231678 4732 scope.go:117] "RemoveContainer" containerID="2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.232683 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sxg85"] Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.252833 4732 scope.go:117] "RemoveContainer" containerID="fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.278941 4732 scope.go:117] "RemoveContainer" containerID="d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136" Oct 10 
07:32:05 crc kubenswrapper[4732]: E1010 07:32:05.280996 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136\": container with ID starting with d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136 not found: ID does not exist" containerID="d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.281036 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136"} err="failed to get container status \"d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136\": rpc error: code = NotFound desc = could not find container \"d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136\": container with ID starting with d551be3780c4e641abfb721312bd717b31a91a854adc9dc4816b4faa5ed5c136 not found: ID does not exist" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.281061 4732 scope.go:117] "RemoveContainer" containerID="2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8" Oct 10 07:32:05 crc kubenswrapper[4732]: E1010 07:32:05.281355 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8\": container with ID starting with 2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8 not found: ID does not exist" containerID="2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.281411 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8"} err="failed to get container status 
\"2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8\": rpc error: code = NotFound desc = could not find container \"2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8\": container with ID starting with 2311fce002f9a9b508a417a81a4a428a7c5b28bf5f9b49a36660a6f80a3598b8 not found: ID does not exist" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.281445 4732 scope.go:117] "RemoveContainer" containerID="fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e" Oct 10 07:32:05 crc kubenswrapper[4732]: E1010 07:32:05.281709 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e\": container with ID starting with fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e not found: ID does not exist" containerID="fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.281735 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e"} err="failed to get container status \"fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e\": rpc error: code = NotFound desc = could not find container \"fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e\": container with ID starting with fc9ffaa60ad9e2c6e00b454527e509cc1c1a3f0dba675c03b4f142a8598ad77e not found: ID does not exist" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.661824 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:32:05 crc kubenswrapper[4732]: E1010 07:32:05.662286 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:32:05 crc kubenswrapper[4732]: I1010 07:32:05.671668 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" path="/var/lib/kubelet/pods/560dd02b-9de9-4489-b95c-b039bcd21e3e/volumes" Oct 10 07:32:18 crc kubenswrapper[4732]: I1010 07:32:18.660640 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:32:18 crc kubenswrapper[4732]: E1010 07:32:18.662893 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:32:32 crc kubenswrapper[4732]: I1010 07:32:32.660894 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:32:33 crc kubenswrapper[4732]: I1010 07:32:33.459681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"94813d2c17f53a9748a0e1e81aaa6f40247e7b25f2ff0afbf59ef90707bf81f1"} Oct 10 07:34:55 crc kubenswrapper[4732]: I1010 07:34:55.356683 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:34:55 crc kubenswrapper[4732]: I1010 07:34:55.357287 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:35:25 crc kubenswrapper[4732]: I1010 07:35:25.356626 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:35:25 crc kubenswrapper[4732]: I1010 07:35:25.357950 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.040933 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2xmv4"] Oct 10 07:35:34 crc kubenswrapper[4732]: E1010 07:35:34.041681 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="extract-utilities" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.041718 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="extract-utilities" Oct 10 07:35:34 crc kubenswrapper[4732]: E1010 07:35:34.041747 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="extract-content" Oct 10 07:35:34 crc 
kubenswrapper[4732]: I1010 07:35:34.041753 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="extract-content" Oct 10 07:35:34 crc kubenswrapper[4732]: E1010 07:35:34.041767 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="registry-server" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.041774 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="registry-server" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.041903 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="560dd02b-9de9-4489-b95c-b039bcd21e3e" containerName="registry-server" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.042895 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.061787 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2xmv4"] Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.166453 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-catalog-content\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.167009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdl7\" (UniqueName: \"kubernetes.io/projected/d833f4a1-023e-4ecc-8077-65349a367aae-kube-api-access-mvdl7\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc 
kubenswrapper[4732]: I1010 07:35:34.167165 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-utilities\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.268499 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-utilities\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.268625 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-catalog-content\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.268666 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdl7\" (UniqueName: \"kubernetes.io/projected/d833f4a1-023e-4ecc-8077-65349a367aae-kube-api-access-mvdl7\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.269066 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-utilities\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.269084 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-catalog-content\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.297930 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdl7\" (UniqueName: \"kubernetes.io/projected/d833f4a1-023e-4ecc-8077-65349a367aae-kube-api-access-mvdl7\") pod \"redhat-operators-2xmv4\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.365866 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:34 crc kubenswrapper[4732]: I1010 07:35:34.634580 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2xmv4"] Oct 10 07:35:35 crc kubenswrapper[4732]: I1010 07:35:35.106611 4732 generic.go:334] "Generic (PLEG): container finished" podID="d833f4a1-023e-4ecc-8077-65349a367aae" containerID="909b49cad8f64678cf00c12c409418f7e3ff838411260d34e061af7978c718c0" exitCode=0 Oct 10 07:35:35 crc kubenswrapper[4732]: I1010 07:35:35.106828 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xmv4" event={"ID":"d833f4a1-023e-4ecc-8077-65349a367aae","Type":"ContainerDied","Data":"909b49cad8f64678cf00c12c409418f7e3ff838411260d34e061af7978c718c0"} Oct 10 07:35:35 crc kubenswrapper[4732]: I1010 07:35:35.106917 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xmv4" event={"ID":"d833f4a1-023e-4ecc-8077-65349a367aae","Type":"ContainerStarted","Data":"203aff75792b2a8226b64ffa17556960d46448e996e8ddb8196020cfedcf5c18"} Oct 10 07:35:36 crc 
kubenswrapper[4732]: I1010 07:35:36.118481 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xmv4" event={"ID":"d833f4a1-023e-4ecc-8077-65349a367aae","Type":"ContainerStarted","Data":"5a654a04dea5cfcdb05d5cb01a8c086dab44f7b5794d67f17aaaae5ade8b0cb8"} Oct 10 07:35:37 crc kubenswrapper[4732]: I1010 07:35:37.135087 4732 generic.go:334] "Generic (PLEG): container finished" podID="d833f4a1-023e-4ecc-8077-65349a367aae" containerID="5a654a04dea5cfcdb05d5cb01a8c086dab44f7b5794d67f17aaaae5ade8b0cb8" exitCode=0 Oct 10 07:35:37 crc kubenswrapper[4732]: I1010 07:35:37.135202 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xmv4" event={"ID":"d833f4a1-023e-4ecc-8077-65349a367aae","Type":"ContainerDied","Data":"5a654a04dea5cfcdb05d5cb01a8c086dab44f7b5794d67f17aaaae5ade8b0cb8"} Oct 10 07:35:38 crc kubenswrapper[4732]: I1010 07:35:38.148078 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xmv4" event={"ID":"d833f4a1-023e-4ecc-8077-65349a367aae","Type":"ContainerStarted","Data":"1ca99dd098281dd80f30c2a4346a83062ce1c2ce5647dc6f0fcddc05f1f1f3ad"} Oct 10 07:35:38 crc kubenswrapper[4732]: I1010 07:35:38.179041 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2xmv4" podStartSLOduration=1.6741615749999998 podStartE2EDuration="4.179014112s" podCreationTimestamp="2025-10-10 07:35:34 +0000 UTC" firstStartedPulling="2025-10-10 07:35:35.108735156 +0000 UTC m=+2662.178326397" lastFinishedPulling="2025-10-10 07:35:37.613587663 +0000 UTC m=+2664.683178934" observedRunningTime="2025-10-10 07:35:38.176457932 +0000 UTC m=+2665.246049173" watchObservedRunningTime="2025-10-10 07:35:38.179014112 +0000 UTC m=+2665.248605393" Oct 10 07:35:44 crc kubenswrapper[4732]: I1010 07:35:44.366885 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:44 crc kubenswrapper[4732]: I1010 07:35:44.368963 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:44 crc kubenswrapper[4732]: I1010 07:35:44.416676 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:45 crc kubenswrapper[4732]: I1010 07:35:45.264135 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:45 crc kubenswrapper[4732]: I1010 07:35:45.315944 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2xmv4"] Oct 10 07:35:47 crc kubenswrapper[4732]: I1010 07:35:47.224517 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2xmv4" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="registry-server" containerID="cri-o://1ca99dd098281dd80f30c2a4346a83062ce1c2ce5647dc6f0fcddc05f1f1f3ad" gracePeriod=2 Oct 10 07:35:49 crc kubenswrapper[4732]: I1010 07:35:49.244211 4732 generic.go:334] "Generic (PLEG): container finished" podID="d833f4a1-023e-4ecc-8077-65349a367aae" containerID="1ca99dd098281dd80f30c2a4346a83062ce1c2ce5647dc6f0fcddc05f1f1f3ad" exitCode=0 Oct 10 07:35:49 crc kubenswrapper[4732]: I1010 07:35:49.244366 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xmv4" event={"ID":"d833f4a1-023e-4ecc-8077-65349a367aae","Type":"ContainerDied","Data":"1ca99dd098281dd80f30c2a4346a83062ce1c2ce5647dc6f0fcddc05f1f1f3ad"} Oct 10 07:35:49 crc kubenswrapper[4732]: I1010 07:35:49.945509 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.050881 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-utilities\") pod \"d833f4a1-023e-4ecc-8077-65349a367aae\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.050955 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-catalog-content\") pod \"d833f4a1-023e-4ecc-8077-65349a367aae\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.051036 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdl7\" (UniqueName: \"kubernetes.io/projected/d833f4a1-023e-4ecc-8077-65349a367aae-kube-api-access-mvdl7\") pod \"d833f4a1-023e-4ecc-8077-65349a367aae\" (UID: \"d833f4a1-023e-4ecc-8077-65349a367aae\") " Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.052803 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-utilities" (OuterVolumeSpecName: "utilities") pod "d833f4a1-023e-4ecc-8077-65349a367aae" (UID: "d833f4a1-023e-4ecc-8077-65349a367aae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.059924 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d833f4a1-023e-4ecc-8077-65349a367aae-kube-api-access-mvdl7" (OuterVolumeSpecName: "kube-api-access-mvdl7") pod "d833f4a1-023e-4ecc-8077-65349a367aae" (UID: "d833f4a1-023e-4ecc-8077-65349a367aae"). InnerVolumeSpecName "kube-api-access-mvdl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.145406 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d833f4a1-023e-4ecc-8077-65349a367aae" (UID: "d833f4a1-023e-4ecc-8077-65349a367aae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.153600 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.153686 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d833f4a1-023e-4ecc-8077-65349a367aae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.153765 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdl7\" (UniqueName: \"kubernetes.io/projected/d833f4a1-023e-4ecc-8077-65349a367aae-kube-api-access-mvdl7\") on node \"crc\" DevicePath \"\"" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.258996 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2xmv4" event={"ID":"d833f4a1-023e-4ecc-8077-65349a367aae","Type":"ContainerDied","Data":"203aff75792b2a8226b64ffa17556960d46448e996e8ddb8196020cfedcf5c18"} Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.259055 4732 scope.go:117] "RemoveContainer" containerID="1ca99dd098281dd80f30c2a4346a83062ce1c2ce5647dc6f0fcddc05f1f1f3ad" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.259094 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2xmv4" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.283274 4732 scope.go:117] "RemoveContainer" containerID="5a654a04dea5cfcdb05d5cb01a8c086dab44f7b5794d67f17aaaae5ade8b0cb8" Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.303361 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2xmv4"] Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.323803 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2xmv4"] Oct 10 07:35:50 crc kubenswrapper[4732]: I1010 07:35:50.330286 4732 scope.go:117] "RemoveContainer" containerID="909b49cad8f64678cf00c12c409418f7e3ff838411260d34e061af7978c718c0" Oct 10 07:35:51 crc kubenswrapper[4732]: I1010 07:35:51.676185 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" path="/var/lib/kubelet/pods/d833f4a1-023e-4ecc-8077-65349a367aae/volumes" Oct 10 07:35:55 crc kubenswrapper[4732]: I1010 07:35:55.356151 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:35:55 crc kubenswrapper[4732]: I1010 07:35:55.356603 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:35:55 crc kubenswrapper[4732]: I1010 07:35:55.356667 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:35:55 crc 
kubenswrapper[4732]: I1010 07:35:55.357954 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94813d2c17f53a9748a0e1e81aaa6f40247e7b25f2ff0afbf59ef90707bf81f1"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:35:55 crc kubenswrapper[4732]: I1010 07:35:55.358162 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://94813d2c17f53a9748a0e1e81aaa6f40247e7b25f2ff0afbf59ef90707bf81f1" gracePeriod=600 Oct 10 07:35:56 crc kubenswrapper[4732]: I1010 07:35:56.318774 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="94813d2c17f53a9748a0e1e81aaa6f40247e7b25f2ff0afbf59ef90707bf81f1" exitCode=0 Oct 10 07:35:56 crc kubenswrapper[4732]: I1010 07:35:56.318948 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"94813d2c17f53a9748a0e1e81aaa6f40247e7b25f2ff0afbf59ef90707bf81f1"} Oct 10 07:35:56 crc kubenswrapper[4732]: I1010 07:35:56.319257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c"} Oct 10 07:35:56 crc kubenswrapper[4732]: I1010 07:35:56.319294 4732 scope.go:117] "RemoveContainer" containerID="eed0661882f6aa0a63599b425655a5d2c437782eef48558052e511fc6b63d713" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.363541 4732 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hnppr"] Oct 10 07:37:26 crc kubenswrapper[4732]: E1010 07:37:26.364416 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="registry-server" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.364429 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="registry-server" Oct 10 07:37:26 crc kubenswrapper[4732]: E1010 07:37:26.364446 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="extract-utilities" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.364453 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="extract-utilities" Oct 10 07:37:26 crc kubenswrapper[4732]: E1010 07:37:26.364481 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="extract-content" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.364491 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="extract-content" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.364671 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d833f4a1-023e-4ecc-8077-65349a367aae" containerName="registry-server" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.365777 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.391725 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnppr"] Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.528188 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szw9q\" (UniqueName: \"kubernetes.io/projected/249916e1-cd89-4c71-baa3-4a710faf816e-kube-api-access-szw9q\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.528292 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-catalog-content\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.528334 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-utilities\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.629954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szw9q\" (UniqueName: \"kubernetes.io/projected/249916e1-cd89-4c71-baa3-4a710faf816e-kube-api-access-szw9q\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.630035 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-catalog-content\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.630069 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-utilities\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.630514 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-utilities\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.630626 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-catalog-content\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.651664 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szw9q\" (UniqueName: \"kubernetes.io/projected/249916e1-cd89-4c71-baa3-4a710faf816e-kube-api-access-szw9q\") pod \"redhat-marketplace-hnppr\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:26 crc kubenswrapper[4732]: I1010 07:37:26.696428 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:27 crc kubenswrapper[4732]: I1010 07:37:27.111310 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnppr"] Oct 10 07:37:27 crc kubenswrapper[4732]: I1010 07:37:27.206315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnppr" event={"ID":"249916e1-cd89-4c71-baa3-4a710faf816e","Type":"ContainerStarted","Data":"2d6ab21b97bbea34942a93ae3ccaf97b0d94f12a8dce68276495e3284ee4ff36"} Oct 10 07:37:28 crc kubenswrapper[4732]: I1010 07:37:28.220850 4732 generic.go:334] "Generic (PLEG): container finished" podID="249916e1-cd89-4c71-baa3-4a710faf816e" containerID="37182ff6f39452867f977ae80d66a9940f74fd30c4fae9e3ea91b5a25e088765" exitCode=0 Oct 10 07:37:28 crc kubenswrapper[4732]: I1010 07:37:28.221110 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnppr" event={"ID":"249916e1-cd89-4c71-baa3-4a710faf816e","Type":"ContainerDied","Data":"37182ff6f39452867f977ae80d66a9940f74fd30c4fae9e3ea91b5a25e088765"} Oct 10 07:37:28 crc kubenswrapper[4732]: I1010 07:37:28.224835 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:37:30 crc kubenswrapper[4732]: I1010 07:37:30.240994 4732 generic.go:334] "Generic (PLEG): container finished" podID="249916e1-cd89-4c71-baa3-4a710faf816e" containerID="4095e6d1c5297550191a60ab07a8112a56af5bc2a2db2c81c7c79f3620e6d9ab" exitCode=0 Oct 10 07:37:30 crc kubenswrapper[4732]: I1010 07:37:30.241112 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnppr" event={"ID":"249916e1-cd89-4c71-baa3-4a710faf816e","Type":"ContainerDied","Data":"4095e6d1c5297550191a60ab07a8112a56af5bc2a2db2c81c7c79f3620e6d9ab"} Oct 10 07:37:31 crc kubenswrapper[4732]: I1010 07:37:31.251327 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-hnppr" event={"ID":"249916e1-cd89-4c71-baa3-4a710faf816e","Type":"ContainerStarted","Data":"f7d8189a9c1eb246d5434ab0b32de0d03eb307cc229c04f00689b8ca4d8640ae"} Oct 10 07:37:31 crc kubenswrapper[4732]: I1010 07:37:31.272678 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hnppr" podStartSLOduration=2.768225825 podStartE2EDuration="5.272659102s" podCreationTimestamp="2025-10-10 07:37:26 +0000 UTC" firstStartedPulling="2025-10-10 07:37:28.223573381 +0000 UTC m=+2775.293164662" lastFinishedPulling="2025-10-10 07:37:30.728006698 +0000 UTC m=+2777.797597939" observedRunningTime="2025-10-10 07:37:31.266613607 +0000 UTC m=+2778.336204888" watchObservedRunningTime="2025-10-10 07:37:31.272659102 +0000 UTC m=+2778.342250343" Oct 10 07:37:36 crc kubenswrapper[4732]: I1010 07:37:36.697564 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:36 crc kubenswrapper[4732]: I1010 07:37:36.698125 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:36 crc kubenswrapper[4732]: I1010 07:37:36.749220 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:37 crc kubenswrapper[4732]: I1010 07:37:37.371201 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:37 crc kubenswrapper[4732]: I1010 07:37:37.435936 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnppr"] Oct 10 07:37:39 crc kubenswrapper[4732]: I1010 07:37:39.322152 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hnppr" 
podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="registry-server" containerID="cri-o://f7d8189a9c1eb246d5434ab0b32de0d03eb307cc229c04f00689b8ca4d8640ae" gracePeriod=2 Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.332701 4732 generic.go:334] "Generic (PLEG): container finished" podID="249916e1-cd89-4c71-baa3-4a710faf816e" containerID="f7d8189a9c1eb246d5434ab0b32de0d03eb307cc229c04f00689b8ca4d8640ae" exitCode=0 Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.332767 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnppr" event={"ID":"249916e1-cd89-4c71-baa3-4a710faf816e","Type":"ContainerDied","Data":"f7d8189a9c1eb246d5434ab0b32de0d03eb307cc229c04f00689b8ca4d8640ae"} Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.333341 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hnppr" event={"ID":"249916e1-cd89-4c71-baa3-4a710faf816e","Type":"ContainerDied","Data":"2d6ab21b97bbea34942a93ae3ccaf97b0d94f12a8dce68276495e3284ee4ff36"} Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.333359 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6ab21b97bbea34942a93ae3ccaf97b0d94f12a8dce68276495e3284ee4ff36" Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.336364 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.440891 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-catalog-content\") pod \"249916e1-cd89-4c71-baa3-4a710faf816e\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.440970 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szw9q\" (UniqueName: \"kubernetes.io/projected/249916e1-cd89-4c71-baa3-4a710faf816e-kube-api-access-szw9q\") pod \"249916e1-cd89-4c71-baa3-4a710faf816e\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.440995 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-utilities\") pod \"249916e1-cd89-4c71-baa3-4a710faf816e\" (UID: \"249916e1-cd89-4c71-baa3-4a710faf816e\") " Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.442325 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-utilities" (OuterVolumeSpecName: "utilities") pod "249916e1-cd89-4c71-baa3-4a710faf816e" (UID: "249916e1-cd89-4c71-baa3-4a710faf816e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.452078 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249916e1-cd89-4c71-baa3-4a710faf816e-kube-api-access-szw9q" (OuterVolumeSpecName: "kube-api-access-szw9q") pod "249916e1-cd89-4c71-baa3-4a710faf816e" (UID: "249916e1-cd89-4c71-baa3-4a710faf816e"). InnerVolumeSpecName "kube-api-access-szw9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.542138 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szw9q\" (UniqueName: \"kubernetes.io/projected/249916e1-cd89-4c71-baa3-4a710faf816e-kube-api-access-szw9q\") on node \"crc\" DevicePath \"\"" Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.542187 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.913618 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "249916e1-cd89-4c71-baa3-4a710faf816e" (UID: "249916e1-cd89-4c71-baa3-4a710faf816e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:37:40 crc kubenswrapper[4732]: I1010 07:37:40.948545 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/249916e1-cd89-4c71-baa3-4a710faf816e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:37:41 crc kubenswrapper[4732]: I1010 07:37:41.340835 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hnppr" Oct 10 07:37:41 crc kubenswrapper[4732]: I1010 07:37:41.385241 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnppr"] Oct 10 07:37:41 crc kubenswrapper[4732]: I1010 07:37:41.392417 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hnppr"] Oct 10 07:37:41 crc kubenswrapper[4732]: I1010 07:37:41.669900 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249916e1-cd89-4c71-baa3-4a710faf816e" path="/var/lib/kubelet/pods/249916e1-cd89-4c71-baa3-4a710faf816e/volumes" Oct 10 07:37:43 crc kubenswrapper[4732]: I1010 07:37:43.991056 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kxjqg"] Oct 10 07:37:43 crc kubenswrapper[4732]: E1010 07:37:43.991736 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="extract-utilities" Oct 10 07:37:43 crc kubenswrapper[4732]: I1010 07:37:43.991752 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="extract-utilities" Oct 10 07:37:43 crc kubenswrapper[4732]: E1010 07:37:43.991776 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="registry-server" Oct 10 07:37:43 crc kubenswrapper[4732]: I1010 07:37:43.991784 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="registry-server" Oct 10 07:37:43 crc kubenswrapper[4732]: E1010 07:37:43.991804 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="extract-content" Oct 10 07:37:43 crc kubenswrapper[4732]: I1010 07:37:43.991811 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="extract-content" Oct 10 07:37:43 crc kubenswrapper[4732]: I1010 07:37:43.991985 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="249916e1-cd89-4c71-baa3-4a710faf816e" containerName="registry-server" Oct 10 07:37:43 crc kubenswrapper[4732]: I1010 07:37:43.993364 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.007903 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kxjqg"] Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.095211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-utilities\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.095887 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-catalog-content\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.096108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf2gh\" (UniqueName: \"kubernetes.io/projected/888d2b7c-6319-403e-b61f-02b5715fcb1d-kube-api-access-bf2gh\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.197532 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-utilities\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.197637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-catalog-content\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.197676 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf2gh\" (UniqueName: \"kubernetes.io/projected/888d2b7c-6319-403e-b61f-02b5715fcb1d-kube-api-access-bf2gh\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.198130 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-utilities\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.198278 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-catalog-content\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.217419 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bf2gh\" (UniqueName: \"kubernetes.io/projected/888d2b7c-6319-403e-b61f-02b5715fcb1d-kube-api-access-bf2gh\") pod \"certified-operators-kxjqg\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.328816 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:44 crc kubenswrapper[4732]: I1010 07:37:44.824263 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kxjqg"] Oct 10 07:37:45 crc kubenswrapper[4732]: I1010 07:37:45.378182 4732 generic.go:334] "Generic (PLEG): container finished" podID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerID="a533dbb6b11485577aed0d01bfaade5953add9ca71bf772454f3f2c798299d2b" exitCode=0 Oct 10 07:37:45 crc kubenswrapper[4732]: I1010 07:37:45.378227 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxjqg" event={"ID":"888d2b7c-6319-403e-b61f-02b5715fcb1d","Type":"ContainerDied","Data":"a533dbb6b11485577aed0d01bfaade5953add9ca71bf772454f3f2c798299d2b"} Oct 10 07:37:45 crc kubenswrapper[4732]: I1010 07:37:45.378254 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxjqg" event={"ID":"888d2b7c-6319-403e-b61f-02b5715fcb1d","Type":"ContainerStarted","Data":"95814b29287b16142fa40532047e66150f14e68e3fa811eb9847c9bf787533f4"} Oct 10 07:37:46 crc kubenswrapper[4732]: I1010 07:37:46.390874 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxjqg" event={"ID":"888d2b7c-6319-403e-b61f-02b5715fcb1d","Type":"ContainerStarted","Data":"016cdfdfacedb571b5c478a8a2d780f64244fc0841da1b3260cea6713089ae6c"} Oct 10 07:37:47 crc kubenswrapper[4732]: I1010 07:37:47.405408 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerID="016cdfdfacedb571b5c478a8a2d780f64244fc0841da1b3260cea6713089ae6c" exitCode=0 Oct 10 07:37:47 crc kubenswrapper[4732]: I1010 07:37:47.405485 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxjqg" event={"ID":"888d2b7c-6319-403e-b61f-02b5715fcb1d","Type":"ContainerDied","Data":"016cdfdfacedb571b5c478a8a2d780f64244fc0841da1b3260cea6713089ae6c"} Oct 10 07:37:48 crc kubenswrapper[4732]: I1010 07:37:48.420970 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxjqg" event={"ID":"888d2b7c-6319-403e-b61f-02b5715fcb1d","Type":"ContainerStarted","Data":"c04e0923b765b66fc290fa2c90713f2b0377d4c4c1fdd329215d3c07a16d84e8"} Oct 10 07:37:48 crc kubenswrapper[4732]: I1010 07:37:48.458174 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kxjqg" podStartSLOduration=2.902918246 podStartE2EDuration="5.458152572s" podCreationTimestamp="2025-10-10 07:37:43 +0000 UTC" firstStartedPulling="2025-10-10 07:37:45.381227432 +0000 UTC m=+2792.450818683" lastFinishedPulling="2025-10-10 07:37:47.936461768 +0000 UTC m=+2795.006053009" observedRunningTime="2025-10-10 07:37:48.45623153 +0000 UTC m=+2795.525822791" watchObservedRunningTime="2025-10-10 07:37:48.458152572 +0000 UTC m=+2795.527743813" Oct 10 07:37:54 crc kubenswrapper[4732]: I1010 07:37:54.329985 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:54 crc kubenswrapper[4732]: I1010 07:37:54.331066 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:54 crc kubenswrapper[4732]: I1010 07:37:54.410397 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:54 
crc kubenswrapper[4732]: I1010 07:37:54.537785 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:54 crc kubenswrapper[4732]: I1010 07:37:54.655053 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kxjqg"] Oct 10 07:37:55 crc kubenswrapper[4732]: I1010 07:37:55.355434 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:37:55 crc kubenswrapper[4732]: I1010 07:37:55.355493 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:37:56 crc kubenswrapper[4732]: I1010 07:37:56.494885 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kxjqg" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerName="registry-server" containerID="cri-o://c04e0923b765b66fc290fa2c90713f2b0377d4c4c1fdd329215d3c07a16d84e8" gracePeriod=2 Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.509521 4732 generic.go:334] "Generic (PLEG): container finished" podID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerID="c04e0923b765b66fc290fa2c90713f2b0377d4c4c1fdd329215d3c07a16d84e8" exitCode=0 Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.509600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxjqg" 
event={"ID":"888d2b7c-6319-403e-b61f-02b5715fcb1d","Type":"ContainerDied","Data":"c04e0923b765b66fc290fa2c90713f2b0377d4c4c1fdd329215d3c07a16d84e8"} Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.704712 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.881623 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-utilities\") pod \"888d2b7c-6319-403e-b61f-02b5715fcb1d\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.881761 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-catalog-content\") pod \"888d2b7c-6319-403e-b61f-02b5715fcb1d\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.881889 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2gh\" (UniqueName: \"kubernetes.io/projected/888d2b7c-6319-403e-b61f-02b5715fcb1d-kube-api-access-bf2gh\") pod \"888d2b7c-6319-403e-b61f-02b5715fcb1d\" (UID: \"888d2b7c-6319-403e-b61f-02b5715fcb1d\") " Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.882766 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-utilities" (OuterVolumeSpecName: "utilities") pod "888d2b7c-6319-403e-b61f-02b5715fcb1d" (UID: "888d2b7c-6319-403e-b61f-02b5715fcb1d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.891867 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/888d2b7c-6319-403e-b61f-02b5715fcb1d-kube-api-access-bf2gh" (OuterVolumeSpecName: "kube-api-access-bf2gh") pod "888d2b7c-6319-403e-b61f-02b5715fcb1d" (UID: "888d2b7c-6319-403e-b61f-02b5715fcb1d"). InnerVolumeSpecName "kube-api-access-bf2gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.983992 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:37:57 crc kubenswrapper[4732]: I1010 07:37:57.984051 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2gh\" (UniqueName: \"kubernetes.io/projected/888d2b7c-6319-403e-b61f-02b5715fcb1d-kube-api-access-bf2gh\") on node \"crc\" DevicePath \"\"" Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.136357 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "888d2b7c-6319-403e-b61f-02b5715fcb1d" (UID: "888d2b7c-6319-403e-b61f-02b5715fcb1d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.186598 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/888d2b7c-6319-403e-b61f-02b5715fcb1d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.532082 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kxjqg" event={"ID":"888d2b7c-6319-403e-b61f-02b5715fcb1d","Type":"ContainerDied","Data":"95814b29287b16142fa40532047e66150f14e68e3fa811eb9847c9bf787533f4"} Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.532172 4732 scope.go:117] "RemoveContainer" containerID="c04e0923b765b66fc290fa2c90713f2b0377d4c4c1fdd329215d3c07a16d84e8" Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.532226 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kxjqg" Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.567751 4732 scope.go:117] "RemoveContainer" containerID="016cdfdfacedb571b5c478a8a2d780f64244fc0841da1b3260cea6713089ae6c" Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.604822 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kxjqg"] Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.609432 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kxjqg"] Oct 10 07:37:58 crc kubenswrapper[4732]: I1010 07:37:58.613117 4732 scope.go:117] "RemoveContainer" containerID="a533dbb6b11485577aed0d01bfaade5953add9ca71bf772454f3f2c798299d2b" Oct 10 07:37:59 crc kubenswrapper[4732]: I1010 07:37:59.673204 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" path="/var/lib/kubelet/pods/888d2b7c-6319-403e-b61f-02b5715fcb1d/volumes" Oct 10 07:38:25 crc 
kubenswrapper[4732]: I1010 07:38:25.355898 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:38:25 crc kubenswrapper[4732]: I1010 07:38:25.356679 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:38:55 crc kubenswrapper[4732]: I1010 07:38:55.356305 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:38:55 crc kubenswrapper[4732]: I1010 07:38:55.356987 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:38:55 crc kubenswrapper[4732]: I1010 07:38:55.357063 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:38:55 crc kubenswrapper[4732]: I1010 07:38:55.357944 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c"} 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:38:55 crc kubenswrapper[4732]: I1010 07:38:55.358027 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" gracePeriod=600 Oct 10 07:38:55 crc kubenswrapper[4732]: E1010 07:38:55.513295 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:38:56 crc kubenswrapper[4732]: I1010 07:38:56.086806 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" exitCode=0 Oct 10 07:38:56 crc kubenswrapper[4732]: I1010 07:38:56.086859 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c"} Oct 10 07:38:56 crc kubenswrapper[4732]: I1010 07:38:56.086895 4732 scope.go:117] "RemoveContainer" containerID="94813d2c17f53a9748a0e1e81aaa6f40247e7b25f2ff0afbf59ef90707bf81f1" Oct 10 07:38:56 crc kubenswrapper[4732]: I1010 07:38:56.087567 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 
10 07:38:56 crc kubenswrapper[4732]: E1010 07:38:56.087945 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:39:10 crc kubenswrapper[4732]: I1010 07:39:10.660489 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:39:10 crc kubenswrapper[4732]: E1010 07:39:10.661512 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:39:23 crc kubenswrapper[4732]: I1010 07:39:23.667360 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:39:23 crc kubenswrapper[4732]: E1010 07:39:23.668624 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:39:36 crc kubenswrapper[4732]: I1010 07:39:36.660211 4732 scope.go:117] "RemoveContainer" 
containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:39:36 crc kubenswrapper[4732]: E1010 07:39:36.661051 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:39:47 crc kubenswrapper[4732]: I1010 07:39:47.660525 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:39:47 crc kubenswrapper[4732]: E1010 07:39:47.661558 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:40:00 crc kubenswrapper[4732]: I1010 07:40:00.660852 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:40:00 crc kubenswrapper[4732]: E1010 07:40:00.662089 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:40:11 crc kubenswrapper[4732]: I1010 07:40:11.660564 4732 scope.go:117] 
"RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:40:11 crc kubenswrapper[4732]: E1010 07:40:11.661547 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:40:26 crc kubenswrapper[4732]: I1010 07:40:26.660847 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:40:26 crc kubenswrapper[4732]: E1010 07:40:26.661634 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:40:37 crc kubenswrapper[4732]: I1010 07:40:37.661907 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:40:37 crc kubenswrapper[4732]: E1010 07:40:37.662654 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:40:48 crc kubenswrapper[4732]: I1010 07:40:48.660940 
4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:40:48 crc kubenswrapper[4732]: E1010 07:40:48.662088 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:41:00 crc kubenswrapper[4732]: I1010 07:41:00.660335 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:41:00 crc kubenswrapper[4732]: E1010 07:41:00.661487 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:41:13 crc kubenswrapper[4732]: I1010 07:41:13.665521 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:41:13 crc kubenswrapper[4732]: E1010 07:41:13.666742 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:41:28 crc kubenswrapper[4732]: I1010 
07:41:28.659723 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:41:28 crc kubenswrapper[4732]: E1010 07:41:28.660389 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:41:41 crc kubenswrapper[4732]: I1010 07:41:41.660180 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:41:41 crc kubenswrapper[4732]: E1010 07:41:41.661015 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:41:53 crc kubenswrapper[4732]: I1010 07:41:53.664283 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:41:53 crc kubenswrapper[4732]: E1010 07:41:53.665305 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:42:08 crc 
kubenswrapper[4732]: I1010 07:42:08.660673 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:42:08 crc kubenswrapper[4732]: E1010 07:42:08.661719 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:42:09 crc kubenswrapper[4732]: I1010 07:42:09.994232 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sphbb"] Oct 10 07:42:09 crc kubenswrapper[4732]: E1010 07:42:09.994600 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerName="registry-server" Oct 10 07:42:09 crc kubenswrapper[4732]: I1010 07:42:09.994619 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerName="registry-server" Oct 10 07:42:09 crc kubenswrapper[4732]: E1010 07:42:09.994647 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerName="extract-content" Oct 10 07:42:09 crc kubenswrapper[4732]: I1010 07:42:09.994658 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerName="extract-content" Oct 10 07:42:09 crc kubenswrapper[4732]: E1010 07:42:09.994728 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerName="extract-utilities" Oct 10 07:42:09 crc kubenswrapper[4732]: I1010 07:42:09.994740 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" 
containerName="extract-utilities" Oct 10 07:42:09 crc kubenswrapper[4732]: I1010 07:42:09.994970 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="888d2b7c-6319-403e-b61f-02b5715fcb1d" containerName="registry-server" Oct 10 07:42:09 crc kubenswrapper[4732]: I1010 07:42:09.996537 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.010430 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sphbb"] Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.144232 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-utilities\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.144303 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z697l\" (UniqueName: \"kubernetes.io/projected/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-kube-api-access-z697l\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.144378 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-catalog-content\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.245488 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-catalog-content\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.245640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-utilities\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.245712 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z697l\" (UniqueName: \"kubernetes.io/projected/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-kube-api-access-z697l\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.246329 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-catalog-content\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.246416 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-utilities\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.276590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z697l\" (UniqueName: 
\"kubernetes.io/projected/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-kube-api-access-z697l\") pod \"community-operators-sphbb\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.324209 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.556988 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sphbb"] Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.915635 4732 generic.go:334] "Generic (PLEG): container finished" podID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerID="41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a" exitCode=0 Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.915759 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sphbb" event={"ID":"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a","Type":"ContainerDied","Data":"41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a"} Oct 10 07:42:10 crc kubenswrapper[4732]: I1010 07:42:10.916039 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sphbb" event={"ID":"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a","Type":"ContainerStarted","Data":"0f75510c90d61ece464dfb548ad984ec5fb238d580646fc98ad1ea90b77d9ae6"} Oct 10 07:42:11 crc kubenswrapper[4732]: I1010 07:42:11.926670 4732 generic.go:334] "Generic (PLEG): container finished" podID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerID="98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8" exitCode=0 Oct 10 07:42:11 crc kubenswrapper[4732]: I1010 07:42:11.927089 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sphbb" 
event={"ID":"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a","Type":"ContainerDied","Data":"98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8"} Oct 10 07:42:12 crc kubenswrapper[4732]: I1010 07:42:12.935822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sphbb" event={"ID":"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a","Type":"ContainerStarted","Data":"9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223"} Oct 10 07:42:12 crc kubenswrapper[4732]: I1010 07:42:12.962496 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sphbb" podStartSLOduration=2.543768216 podStartE2EDuration="3.962468774s" podCreationTimestamp="2025-10-10 07:42:09 +0000 UTC" firstStartedPulling="2025-10-10 07:42:10.917672699 +0000 UTC m=+3057.987263940" lastFinishedPulling="2025-10-10 07:42:12.336373267 +0000 UTC m=+3059.405964498" observedRunningTime="2025-10-10 07:42:12.958538018 +0000 UTC m=+3060.028129279" watchObservedRunningTime="2025-10-10 07:42:12.962468774 +0000 UTC m=+3060.032060025" Oct 10 07:42:20 crc kubenswrapper[4732]: I1010 07:42:20.324553 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:20 crc kubenswrapper[4732]: I1010 07:42:20.325403 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:20 crc kubenswrapper[4732]: I1010 07:42:20.405280 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:21 crc kubenswrapper[4732]: I1010 07:42:21.099552 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:21 crc kubenswrapper[4732]: I1010 07:42:21.171085 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sphbb"] Oct 10 07:42:21 crc kubenswrapper[4732]: I1010 07:42:21.660550 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:42:21 crc kubenswrapper[4732]: E1010 07:42:21.660980 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:42:23 crc kubenswrapper[4732]: I1010 07:42:23.043394 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sphbb" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="registry-server" containerID="cri-o://9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223" gracePeriod=2 Oct 10 07:42:23 crc kubenswrapper[4732]: I1010 07:42:23.965252 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.052929 4732 generic.go:334] "Generic (PLEG): container finished" podID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerID="9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223" exitCode=0 Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.052985 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sphbb" event={"ID":"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a","Type":"ContainerDied","Data":"9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223"} Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.053022 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sphbb" event={"ID":"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a","Type":"ContainerDied","Data":"0f75510c90d61ece464dfb548ad984ec5fb238d580646fc98ad1ea90b77d9ae6"} Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.053035 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sphbb" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.053045 4732 scope.go:117] "RemoveContainer" containerID="9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.065736 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-utilities\") pod \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.065852 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z697l\" (UniqueName: \"kubernetes.io/projected/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-kube-api-access-z697l\") pod \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.065903 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-catalog-content\") pod \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\" (UID: \"bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a\") " Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.066861 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-utilities" (OuterVolumeSpecName: "utilities") pod "bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" (UID: "bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.072881 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-kube-api-access-z697l" (OuterVolumeSpecName: "kube-api-access-z697l") pod "bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" (UID: "bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a"). InnerVolumeSpecName "kube-api-access-z697l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.076899 4732 scope.go:117] "RemoveContainer" containerID="98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.104741 4732 scope.go:117] "RemoveContainer" containerID="41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.124482 4732 scope.go:117] "RemoveContainer" containerID="9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223" Oct 10 07:42:24 crc kubenswrapper[4732]: E1010 07:42:24.124925 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223\": container with ID starting with 9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223 not found: ID does not exist" containerID="9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.124994 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223"} err="failed to get container status \"9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223\": rpc error: code = NotFound desc = could not find container \"9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223\": container 
with ID starting with 9ed62fb7d69e03c6e0e00912c0a1a78f8433ba0b7591aa7aa6ad048acdc5d223 not found: ID does not exist" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.125029 4732 scope.go:117] "RemoveContainer" containerID="98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8" Oct 10 07:42:24 crc kubenswrapper[4732]: E1010 07:42:24.125547 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8\": container with ID starting with 98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8 not found: ID does not exist" containerID="98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.125618 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8"} err="failed to get container status \"98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8\": rpc error: code = NotFound desc = could not find container \"98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8\": container with ID starting with 98b95db7fe0b4e9ffe14c0d81b9b3c634f22936aaafa15af9acad6e4408133d8 not found: ID does not exist" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.125643 4732 scope.go:117] "RemoveContainer" containerID="41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a" Oct 10 07:42:24 crc kubenswrapper[4732]: E1010 07:42:24.125951 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a\": container with ID starting with 41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a not found: ID does not exist" containerID="41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a" 
Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.125990 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a"} err="failed to get container status \"41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a\": rpc error: code = NotFound desc = could not find container \"41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a\": container with ID starting with 41d21e7b87645b35a8e407cc4a8f096e6b02660a088bfd564c350d761414233a not found: ID does not exist" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.130635 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" (UID: "bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.167139 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.167197 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.167215 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z697l\" (UniqueName: \"kubernetes.io/projected/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a-kube-api-access-z697l\") on node \"crc\" DevicePath \"\"" Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.402420 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-sphbb"] Oct 10 07:42:24 crc kubenswrapper[4732]: I1010 07:42:24.407953 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sphbb"] Oct 10 07:42:25 crc kubenswrapper[4732]: I1010 07:42:25.674149 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" path="/var/lib/kubelet/pods/bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a/volumes" Oct 10 07:42:35 crc kubenswrapper[4732]: I1010 07:42:35.660386 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:42:35 crc kubenswrapper[4732]: E1010 07:42:35.661444 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:42:50 crc kubenswrapper[4732]: I1010 07:42:50.660475 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:42:50 crc kubenswrapper[4732]: E1010 07:42:50.661299 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:43:04 crc kubenswrapper[4732]: I1010 07:43:04.660797 4732 scope.go:117] "RemoveContainer" 
containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:43:04 crc kubenswrapper[4732]: E1010 07:43:04.661911 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:43:16 crc kubenswrapper[4732]: I1010 07:43:16.660403 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:43:16 crc kubenswrapper[4732]: E1010 07:43:16.661032 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:43:28 crc kubenswrapper[4732]: I1010 07:43:28.661341 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:43:28 crc kubenswrapper[4732]: E1010 07:43:28.662351 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:43:43 crc kubenswrapper[4732]: I1010 07:43:43.669896 4732 scope.go:117] 
"RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:43:43 crc kubenswrapper[4732]: E1010 07:43:43.670971 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:43:56 crc kubenswrapper[4732]: I1010 07:43:56.660497 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:43:56 crc kubenswrapper[4732]: I1010 07:43:56.991426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"08c62da30f16093e913de96793a90e2b44521000f060ca70d35236045d5b3004"} Oct 10 07:44:23 crc kubenswrapper[4732]: I1010 07:44:23.728360 4732 scope.go:117] "RemoveContainer" containerID="4095e6d1c5297550191a60ab07a8112a56af5bc2a2db2c81c7c79f3620e6d9ab" Oct 10 07:44:23 crc kubenswrapper[4732]: I1010 07:44:23.764244 4732 scope.go:117] "RemoveContainer" containerID="f7d8189a9c1eb246d5434ab0b32de0d03eb307cc229c04f00689b8ca4d8640ae" Oct 10 07:44:23 crc kubenswrapper[4732]: I1010 07:44:23.825813 4732 scope.go:117] "RemoveContainer" containerID="37182ff6f39452867f977ae80d66a9940f74fd30c4fae9e3ea91b5a25e088765" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.208767 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4"] Oct 10 07:45:00 crc kubenswrapper[4732]: E1010 07:45:00.210291 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="extract-utilities" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.210311 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="extract-utilities" Oct 10 07:45:00 crc kubenswrapper[4732]: E1010 07:45:00.210351 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="registry-server" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.210359 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="registry-server" Oct 10 07:45:00 crc kubenswrapper[4732]: E1010 07:45:00.210387 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="extract-content" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.210397 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="extract-content" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.210948 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4e124f-ba1f-43e5-8ad1-8eebd12a5f9a" containerName="registry-server" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.212008 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.217173 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.218289 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.249180 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4"] Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.341866 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b467f54-105b-43d2-ac29-0d3e6cfd993d-config-volume\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.342526 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b467f54-105b-43d2-ac29-0d3e6cfd993d-secret-volume\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.342862 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wd7s\" (UniqueName: \"kubernetes.io/projected/4b467f54-105b-43d2-ac29-0d3e6cfd993d-kube-api-access-4wd7s\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.444764 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b467f54-105b-43d2-ac29-0d3e6cfd993d-config-volume\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.444832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b467f54-105b-43d2-ac29-0d3e6cfd993d-secret-volume\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.444949 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wd7s\" (UniqueName: \"kubernetes.io/projected/4b467f54-105b-43d2-ac29-0d3e6cfd993d-kube-api-access-4wd7s\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.446484 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b467f54-105b-43d2-ac29-0d3e6cfd993d-config-volume\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.451034 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/4b467f54-105b-43d2-ac29-0d3e6cfd993d-secret-volume\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.467971 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wd7s\" (UniqueName: \"kubernetes.io/projected/4b467f54-105b-43d2-ac29-0d3e6cfd993d-kube-api-access-4wd7s\") pod \"collect-profiles-29334705-6vwz4\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:00 crc kubenswrapper[4732]: I1010 07:45:00.560182 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:01 crc kubenswrapper[4732]: I1010 07:45:01.054968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4"] Oct 10 07:45:01 crc kubenswrapper[4732]: I1010 07:45:01.614919 4732 generic.go:334] "Generic (PLEG): container finished" podID="4b467f54-105b-43d2-ac29-0d3e6cfd993d" containerID="933636b52d682f8679c3a3929f2a945ad2993ab5044f7c2e8fabfeffabe91111" exitCode=0 Oct 10 07:45:01 crc kubenswrapper[4732]: I1010 07:45:01.615023 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" event={"ID":"4b467f54-105b-43d2-ac29-0d3e6cfd993d","Type":"ContainerDied","Data":"933636b52d682f8679c3a3929f2a945ad2993ab5044f7c2e8fabfeffabe91111"} Oct 10 07:45:01 crc kubenswrapper[4732]: I1010 07:45:01.615317 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" 
event={"ID":"4b467f54-105b-43d2-ac29-0d3e6cfd993d","Type":"ContainerStarted","Data":"8c317481425b5000f51d8d62a04a3fc45c8e6d9cf29e1657c46a939b6fe4bfaa"} Oct 10 07:45:02 crc kubenswrapper[4732]: I1010 07:45:02.986810 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.085860 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wd7s\" (UniqueName: \"kubernetes.io/projected/4b467f54-105b-43d2-ac29-0d3e6cfd993d-kube-api-access-4wd7s\") pod \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.085911 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b467f54-105b-43d2-ac29-0d3e6cfd993d-config-volume\") pod \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.085959 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b467f54-105b-43d2-ac29-0d3e6cfd993d-secret-volume\") pod \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\" (UID: \"4b467f54-105b-43d2-ac29-0d3e6cfd993d\") " Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.086975 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b467f54-105b-43d2-ac29-0d3e6cfd993d-config-volume" (OuterVolumeSpecName: "config-volume") pod "4b467f54-105b-43d2-ac29-0d3e6cfd993d" (UID: "4b467f54-105b-43d2-ac29-0d3e6cfd993d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.093037 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b467f54-105b-43d2-ac29-0d3e6cfd993d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4b467f54-105b-43d2-ac29-0d3e6cfd993d" (UID: "4b467f54-105b-43d2-ac29-0d3e6cfd993d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.096873 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b467f54-105b-43d2-ac29-0d3e6cfd993d-kube-api-access-4wd7s" (OuterVolumeSpecName: "kube-api-access-4wd7s") pod "4b467f54-105b-43d2-ac29-0d3e6cfd993d" (UID: "4b467f54-105b-43d2-ac29-0d3e6cfd993d"). InnerVolumeSpecName "kube-api-access-4wd7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.187417 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wd7s\" (UniqueName: \"kubernetes.io/projected/4b467f54-105b-43d2-ac29-0d3e6cfd993d-kube-api-access-4wd7s\") on node \"crc\" DevicePath \"\"" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.187456 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b467f54-105b-43d2-ac29-0d3e6cfd993d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.187468 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b467f54-105b-43d2-ac29-0d3e6cfd993d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.637057 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" 
event={"ID":"4b467f54-105b-43d2-ac29-0d3e6cfd993d","Type":"ContainerDied","Data":"8c317481425b5000f51d8d62a04a3fc45c8e6d9cf29e1657c46a939b6fe4bfaa"} Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.637107 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c317481425b5000f51d8d62a04a3fc45c8e6d9cf29e1657c46a939b6fe4bfaa" Oct 10 07:45:03 crc kubenswrapper[4732]: I1010 07:45:03.637198 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4" Oct 10 07:45:04 crc kubenswrapper[4732]: I1010 07:45:04.087053 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"] Oct 10 07:45:04 crc kubenswrapper[4732]: I1010 07:45:04.092148 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334660-wr6cl"] Oct 10 07:45:05 crc kubenswrapper[4732]: I1010 07:45:05.673262 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75983d33-55cf-4310-853f-d3e2b7fefbe3" path="/var/lib/kubelet/pods/75983d33-55cf-4310-853f-d3e2b7fefbe3/volumes" Oct 10 07:45:23 crc kubenswrapper[4732]: I1010 07:45:23.907671 4732 scope.go:117] "RemoveContainer" containerID="034b569f27e9cacb24405c7c443f6d0c37905d2a951ea730a3544339e0ce7a93" Oct 10 07:46:25 crc kubenswrapper[4732]: I1010 07:46:25.356865 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:46:25 crc kubenswrapper[4732]: I1010 07:46:25.357887 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:46:55 crc kubenswrapper[4732]: I1010 07:46:55.356118 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:46:55 crc kubenswrapper[4732]: I1010 07:46:55.356884 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.355465 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.356002 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.356058 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.356725 4732 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08c62da30f16093e913de96793a90e2b44521000f060ca70d35236045d5b3004"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.356786 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://08c62da30f16093e913de96793a90e2b44521000f060ca70d35236045d5b3004" gracePeriod=600 Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.990679 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="08c62da30f16093e913de96793a90e2b44521000f060ca70d35236045d5b3004" exitCode=0 Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.990766 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"08c62da30f16093e913de96793a90e2b44521000f060ca70d35236045d5b3004"} Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.991084 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2"} Oct 10 07:47:25 crc kubenswrapper[4732]: I1010 07:47:25.991119 4732 scope.go:117] "RemoveContainer" containerID="fd49a6f107cc03dc9b8f95a06fc14761bdd7a0e9ef9fdf8a3cbdec0f00f6b17c" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.552162 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2q2db"] Oct 10 07:47:39 crc 
kubenswrapper[4732]: E1010 07:47:39.556151 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b467f54-105b-43d2-ac29-0d3e6cfd993d" containerName="collect-profiles" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.556190 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b467f54-105b-43d2-ac29-0d3e6cfd993d" containerName="collect-profiles" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.556384 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b467f54-105b-43d2-ac29-0d3e6cfd993d" containerName="collect-profiles" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.557568 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.579104 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q2db"] Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.715927 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-catalog-content\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.715990 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-utilities\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.716025 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtx55\" (UniqueName: 
\"kubernetes.io/projected/a967363f-c0c8-448c-8290-0e756adcb9b7-kube-api-access-jtx55\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.817131 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-utilities\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.817199 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtx55\" (UniqueName: \"kubernetes.io/projected/a967363f-c0c8-448c-8290-0e756adcb9b7-kube-api-access-jtx55\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.817320 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-catalog-content\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.817905 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-catalog-content\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.818561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-utilities\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.863716 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtx55\" (UniqueName: \"kubernetes.io/projected/a967363f-c0c8-448c-8290-0e756adcb9b7-kube-api-access-jtx55\") pod \"redhat-marketplace-2q2db\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:39 crc kubenswrapper[4732]: I1010 07:47:39.878193 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:40 crc kubenswrapper[4732]: I1010 07:47:40.329321 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q2db"] Oct 10 07:47:41 crc kubenswrapper[4732]: I1010 07:47:41.119318 4732 generic.go:334] "Generic (PLEG): container finished" podID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerID="fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065" exitCode=0 Oct 10 07:47:41 crc kubenswrapper[4732]: I1010 07:47:41.119391 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q2db" event={"ID":"a967363f-c0c8-448c-8290-0e756adcb9b7","Type":"ContainerDied","Data":"fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065"} Oct 10 07:47:41 crc kubenswrapper[4732]: I1010 07:47:41.119458 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q2db" event={"ID":"a967363f-c0c8-448c-8290-0e756adcb9b7","Type":"ContainerStarted","Data":"162d478b0eaca2401ce315ebaaa20ebce9cffda1f6120ed9df806312ffd699d6"} Oct 10 07:47:41 crc kubenswrapper[4732]: I1010 07:47:41.122495 4732 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 10 07:47:43 crc kubenswrapper[4732]: I1010 07:47:43.139341 4732 generic.go:334] "Generic (PLEG): container finished" podID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerID="4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a" exitCode=0 Oct 10 07:47:43 crc kubenswrapper[4732]: I1010 07:47:43.139586 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q2db" event={"ID":"a967363f-c0c8-448c-8290-0e756adcb9b7","Type":"ContainerDied","Data":"4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a"} Oct 10 07:47:44 crc kubenswrapper[4732]: I1010 07:47:44.149746 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q2db" event={"ID":"a967363f-c0c8-448c-8290-0e756adcb9b7","Type":"ContainerStarted","Data":"80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd"} Oct 10 07:47:44 crc kubenswrapper[4732]: I1010 07:47:44.178284 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2q2db" podStartSLOduration=2.6150248400000002 podStartE2EDuration="5.178259203s" podCreationTimestamp="2025-10-10 07:47:39 +0000 UTC" firstStartedPulling="2025-10-10 07:47:41.122005982 +0000 UTC m=+3388.191597263" lastFinishedPulling="2025-10-10 07:47:43.685240365 +0000 UTC m=+3390.754831626" observedRunningTime="2025-10-10 07:47:44.170285807 +0000 UTC m=+3391.239877108" watchObservedRunningTime="2025-10-10 07:47:44.178259203 +0000 UTC m=+3391.247850484" Oct 10 07:47:49 crc kubenswrapper[4732]: I1010 07:47:49.878759 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:49 crc kubenswrapper[4732]: I1010 07:47:49.879143 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:49 crc 
kubenswrapper[4732]: I1010 07:47:49.952439 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:50 crc kubenswrapper[4732]: I1010 07:47:50.238301 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:50 crc kubenswrapper[4732]: I1010 07:47:50.287482 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q2db"] Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.217483 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2q2db" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="registry-server" containerID="cri-o://80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd" gracePeriod=2 Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.691084 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.813036 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtx55\" (UniqueName: \"kubernetes.io/projected/a967363f-c0c8-448c-8290-0e756adcb9b7-kube-api-access-jtx55\") pod \"a967363f-c0c8-448c-8290-0e756adcb9b7\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.813199 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-utilities\") pod \"a967363f-c0c8-448c-8290-0e756adcb9b7\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.813242 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-catalog-content\") pod \"a967363f-c0c8-448c-8290-0e756adcb9b7\" (UID: \"a967363f-c0c8-448c-8290-0e756adcb9b7\") " Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.814744 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-utilities" (OuterVolumeSpecName: "utilities") pod "a967363f-c0c8-448c-8290-0e756adcb9b7" (UID: "a967363f-c0c8-448c-8290-0e756adcb9b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.817647 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a967363f-c0c8-448c-8290-0e756adcb9b7-kube-api-access-jtx55" (OuterVolumeSpecName: "kube-api-access-jtx55") pod "a967363f-c0c8-448c-8290-0e756adcb9b7" (UID: "a967363f-c0c8-448c-8290-0e756adcb9b7"). InnerVolumeSpecName "kube-api-access-jtx55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.827637 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a967363f-c0c8-448c-8290-0e756adcb9b7" (UID: "a967363f-c0c8-448c-8290-0e756adcb9b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.915109 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.915145 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a967363f-c0c8-448c-8290-0e756adcb9b7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:47:52 crc kubenswrapper[4732]: I1010 07:47:52.915158 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtx55\" (UniqueName: \"kubernetes.io/projected/a967363f-c0c8-448c-8290-0e756adcb9b7-kube-api-access-jtx55\") on node \"crc\" DevicePath \"\"" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.225759 4732 generic.go:334] "Generic (PLEG): container finished" podID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerID="80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd" exitCode=0 Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.225808 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2q2db" event={"ID":"a967363f-c0c8-448c-8290-0e756adcb9b7","Type":"ContainerDied","Data":"80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd"} Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.225857 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2q2db" event={"ID":"a967363f-c0c8-448c-8290-0e756adcb9b7","Type":"ContainerDied","Data":"162d478b0eaca2401ce315ebaaa20ebce9cffda1f6120ed9df806312ffd699d6"} Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.225875 4732 scope.go:117] "RemoveContainer" containerID="80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.225831 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2q2db" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.244211 4732 scope.go:117] "RemoveContainer" containerID="4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.262529 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q2db"] Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.264540 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2q2db"] Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.274500 4732 scope.go:117] "RemoveContainer" containerID="fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.295524 4732 scope.go:117] "RemoveContainer" containerID="80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd" Oct 10 07:47:53 crc kubenswrapper[4732]: E1010 07:47:53.296012 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd\": container with ID starting with 80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd not found: ID does not exist" containerID="80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.296050 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd"} err="failed to get container status \"80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd\": rpc error: code = NotFound desc = could not find container \"80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd\": container with ID starting with 80c5def5c4634027680d75335f0244b2bbe8e838044c3f8e08d9e2e2c8ff1dbd not found: ID does not exist" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.296074 4732 scope.go:117] "RemoveContainer" containerID="4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a" Oct 10 07:47:53 crc kubenswrapper[4732]: E1010 07:47:53.296457 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a\": container with ID starting with 4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a not found: ID does not exist" containerID="4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.296488 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a"} err="failed to get container status \"4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a\": rpc error: code = NotFound desc = could not find container \"4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a\": container with ID starting with 4379e319d77a9119416ae45ad13252f84ac66ad8fb5821d8c131ba821e06502a not found: ID does not exist" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.296510 4732 scope.go:117] "RemoveContainer" containerID="fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065" Oct 10 07:47:53 crc kubenswrapper[4732]: E1010 
07:47:53.296813 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065\": container with ID starting with fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065 not found: ID does not exist" containerID="fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.296860 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065"} err="failed to get container status \"fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065\": rpc error: code = NotFound desc = could not find container \"fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065\": container with ID starting with fe2686fa7dfd1d143fa701e059f02e229d72e7f53bd245c5bced1a0943930065 not found: ID does not exist" Oct 10 07:47:53 crc kubenswrapper[4732]: I1010 07:47:53.674001 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" path="/var/lib/kubelet/pods/a967363f-c0c8-448c-8290-0e756adcb9b7/volumes" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.069291 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sqc9t"] Oct 10 07:48:37 crc kubenswrapper[4732]: E1010 07:48:37.070114 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="registry-server" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.070129 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="registry-server" Oct 10 07:48:37 crc kubenswrapper[4732]: E1010 07:48:37.070139 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="extract-utilities" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.070148 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="extract-utilities" Oct 10 07:48:37 crc kubenswrapper[4732]: E1010 07:48:37.070181 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="extract-content" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.070190 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="extract-content" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.070375 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a967363f-c0c8-448c-8290-0e756adcb9b7" containerName="registry-server" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.071621 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.085725 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqc9t"] Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.249562 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-catalog-content\") pod \"certified-operators-sqc9t\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.249620 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvc28\" (UniqueName: \"kubernetes.io/projected/b622449c-0212-41d8-b378-3f81937ca846-kube-api-access-dvc28\") pod \"certified-operators-sqc9t\" (UID: 
\"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.249720 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-utilities\") pod \"certified-operators-sqc9t\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.350960 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-utilities\") pod \"certified-operators-sqc9t\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.351044 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-catalog-content\") pod \"certified-operators-sqc9t\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.351073 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvc28\" (UniqueName: \"kubernetes.io/projected/b622449c-0212-41d8-b378-3f81937ca846-kube-api-access-dvc28\") pod \"certified-operators-sqc9t\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.351971 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-catalog-content\") pod \"certified-operators-sqc9t\" (UID: 
\"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.352393 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-utilities\") pod \"certified-operators-sqc9t\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.373814 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvc28\" (UniqueName: \"kubernetes.io/projected/b622449c-0212-41d8-b378-3f81937ca846-kube-api-access-dvc28\") pod \"certified-operators-sqc9t\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.394014 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:37 crc kubenswrapper[4732]: I1010 07:48:37.721965 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sqc9t"] Oct 10 07:48:38 crc kubenswrapper[4732]: I1010 07:48:38.663130 4732 generic.go:334] "Generic (PLEG): container finished" podID="b622449c-0212-41d8-b378-3f81937ca846" containerID="1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e" exitCode=0 Oct 10 07:48:38 crc kubenswrapper[4732]: I1010 07:48:38.663201 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqc9t" event={"ID":"b622449c-0212-41d8-b378-3f81937ca846","Type":"ContainerDied","Data":"1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e"} Oct 10 07:48:38 crc kubenswrapper[4732]: I1010 07:48:38.663243 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqc9t" 
event={"ID":"b622449c-0212-41d8-b378-3f81937ca846","Type":"ContainerStarted","Data":"98a8c2ad0962e9c85be83be7f17cc9533e184a4641bed105c2f3443d9aded8e3"} Oct 10 07:48:39 crc kubenswrapper[4732]: I1010 07:48:39.683657 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqc9t" event={"ID":"b622449c-0212-41d8-b378-3f81937ca846","Type":"ContainerStarted","Data":"02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c"} Oct 10 07:48:40 crc kubenswrapper[4732]: I1010 07:48:40.696182 4732 generic.go:334] "Generic (PLEG): container finished" podID="b622449c-0212-41d8-b378-3f81937ca846" containerID="02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c" exitCode=0 Oct 10 07:48:40 crc kubenswrapper[4732]: I1010 07:48:40.696324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqc9t" event={"ID":"b622449c-0212-41d8-b378-3f81937ca846","Type":"ContainerDied","Data":"02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c"} Oct 10 07:48:41 crc kubenswrapper[4732]: I1010 07:48:41.712529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqc9t" event={"ID":"b622449c-0212-41d8-b378-3f81937ca846","Type":"ContainerStarted","Data":"776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e"} Oct 10 07:48:41 crc kubenswrapper[4732]: I1010 07:48:41.746515 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sqc9t" podStartSLOduration=2.280472808 podStartE2EDuration="4.746478424s" podCreationTimestamp="2025-10-10 07:48:37 +0000 UTC" firstStartedPulling="2025-10-10 07:48:38.665927724 +0000 UTC m=+3445.735518975" lastFinishedPulling="2025-10-10 07:48:41.13193335 +0000 UTC m=+3448.201524591" observedRunningTime="2025-10-10 07:48:41.743419191 +0000 UTC m=+3448.813010492" watchObservedRunningTime="2025-10-10 07:48:41.746478424 +0000 UTC 
m=+3448.816069735" Oct 10 07:48:47 crc kubenswrapper[4732]: I1010 07:48:47.395173 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:47 crc kubenswrapper[4732]: I1010 07:48:47.395927 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:47 crc kubenswrapper[4732]: I1010 07:48:47.474637 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:47 crc kubenswrapper[4732]: I1010 07:48:47.841793 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:47 crc kubenswrapper[4732]: I1010 07:48:47.901434 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqc9t"] Oct 10 07:48:49 crc kubenswrapper[4732]: I1010 07:48:49.782328 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sqc9t" podUID="b622449c-0212-41d8-b378-3f81937ca846" containerName="registry-server" containerID="cri-o://776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e" gracePeriod=2 Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.343360 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.383298 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvc28\" (UniqueName: \"kubernetes.io/projected/b622449c-0212-41d8-b378-3f81937ca846-kube-api-access-dvc28\") pod \"b622449c-0212-41d8-b378-3f81937ca846\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.383350 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-utilities\") pod \"b622449c-0212-41d8-b378-3f81937ca846\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.383419 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-catalog-content\") pod \"b622449c-0212-41d8-b378-3f81937ca846\" (UID: \"b622449c-0212-41d8-b378-3f81937ca846\") " Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.384311 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-utilities" (OuterVolumeSpecName: "utilities") pod "b622449c-0212-41d8-b378-3f81937ca846" (UID: "b622449c-0212-41d8-b378-3f81937ca846"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.392763 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b622449c-0212-41d8-b378-3f81937ca846-kube-api-access-dvc28" (OuterVolumeSpecName: "kube-api-access-dvc28") pod "b622449c-0212-41d8-b378-3f81937ca846" (UID: "b622449c-0212-41d8-b378-3f81937ca846"). InnerVolumeSpecName "kube-api-access-dvc28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.463652 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b622449c-0212-41d8-b378-3f81937ca846" (UID: "b622449c-0212-41d8-b378-3f81937ca846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.485193 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.485226 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b622449c-0212-41d8-b378-3f81937ca846-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.485239 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvc28\" (UniqueName: \"kubernetes.io/projected/b622449c-0212-41d8-b378-3f81937ca846-kube-api-access-dvc28\") on node \"crc\" DevicePath \"\"" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.792116 4732 generic.go:334] "Generic (PLEG): container finished" podID="b622449c-0212-41d8-b378-3f81937ca846" containerID="776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e" exitCode=0 Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.792154 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqc9t" event={"ID":"b622449c-0212-41d8-b378-3f81937ca846","Type":"ContainerDied","Data":"776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e"} Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.792176 4732 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sqc9t" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.792193 4732 scope.go:117] "RemoveContainer" containerID="776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.792184 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sqc9t" event={"ID":"b622449c-0212-41d8-b378-3f81937ca846","Type":"ContainerDied","Data":"98a8c2ad0962e9c85be83be7f17cc9533e184a4641bed105c2f3443d9aded8e3"} Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.808851 4732 scope.go:117] "RemoveContainer" containerID="02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.822547 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sqc9t"] Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.839616 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sqc9t"] Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.844342 4732 scope.go:117] "RemoveContainer" containerID="1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.860346 4732 scope.go:117] "RemoveContainer" containerID="776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e" Oct 10 07:48:50 crc kubenswrapper[4732]: E1010 07:48:50.860722 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e\": container with ID starting with 776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e not found: ID does not exist" containerID="776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.860749 
4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e"} err="failed to get container status \"776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e\": rpc error: code = NotFound desc = could not find container \"776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e\": container with ID starting with 776eb432cfbe08a9c8cac08b198268705a90673948142eb1f6dad41c3df3533e not found: ID does not exist" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.860767 4732 scope.go:117] "RemoveContainer" containerID="02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c" Oct 10 07:48:50 crc kubenswrapper[4732]: E1010 07:48:50.860987 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c\": container with ID starting with 02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c not found: ID does not exist" containerID="02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.861019 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c"} err="failed to get container status \"02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c\": rpc error: code = NotFound desc = could not find container \"02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c\": container with ID starting with 02feb334135c15d75b63db4e74c13723b89249a2540f5d5e058c2686e8d4621c not found: ID does not exist" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.861032 4732 scope.go:117] "RemoveContainer" containerID="1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e" Oct 10 07:48:50 crc kubenswrapper[4732]: E1010 
07:48:50.861312 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e\": container with ID starting with 1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e not found: ID does not exist" containerID="1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e" Oct 10 07:48:50 crc kubenswrapper[4732]: I1010 07:48:50.861331 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e"} err="failed to get container status \"1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e\": rpc error: code = NotFound desc = could not find container \"1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e\": container with ID starting with 1f3245b34200368f58577639df6647524fcfbc1d6d812b3242d8f1560ef0c96e not found: ID does not exist" Oct 10 07:48:51 crc kubenswrapper[4732]: I1010 07:48:51.673440 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b622449c-0212-41d8-b378-3f81937ca846" path="/var/lib/kubelet/pods/b622449c-0212-41d8-b378-3f81937ca846/volumes" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.919405 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n77zh"] Oct 10 07:49:24 crc kubenswrapper[4732]: E1010 07:49:24.921249 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b622449c-0212-41d8-b378-3f81937ca846" containerName="registry-server" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.921278 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b622449c-0212-41d8-b378-3f81937ca846" containerName="registry-server" Oct 10 07:49:24 crc kubenswrapper[4732]: E1010 07:49:24.921295 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b622449c-0212-41d8-b378-3f81937ca846" 
containerName="extract-utilities" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.921306 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b622449c-0212-41d8-b378-3f81937ca846" containerName="extract-utilities" Oct 10 07:49:24 crc kubenswrapper[4732]: E1010 07:49:24.921328 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b622449c-0212-41d8-b378-3f81937ca846" containerName="extract-content" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.921337 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b622449c-0212-41d8-b378-3f81937ca846" containerName="extract-content" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.921539 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b622449c-0212-41d8-b378-3f81937ca846" containerName="registry-server" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.922892 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.934381 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n77zh"] Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.965482 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-utilities\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.965798 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-catalog-content\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 
10 07:49:24 crc kubenswrapper[4732]: I1010 07:49:24.966069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82qq8\" (UniqueName: \"kubernetes.io/projected/15fca5f0-b5e1-4a01-9fde-a342824b4594-kube-api-access-82qq8\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.067616 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82qq8\" (UniqueName: \"kubernetes.io/projected/15fca5f0-b5e1-4a01-9fde-a342824b4594-kube-api-access-82qq8\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.067731 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-utilities\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.067780 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-catalog-content\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.068209 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-utilities\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc 
kubenswrapper[4732]: I1010 07:49:25.068320 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-catalog-content\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.100599 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82qq8\" (UniqueName: \"kubernetes.io/projected/15fca5f0-b5e1-4a01-9fde-a342824b4594-kube-api-access-82qq8\") pod \"redhat-operators-n77zh\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.264767 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.356203 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.356863 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:49:25 crc kubenswrapper[4732]: I1010 07:49:25.739570 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n77zh"] Oct 10 07:49:26 crc kubenswrapper[4732]: I1010 07:49:26.139442 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerID="952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9" exitCode=0 Oct 10 07:49:26 crc kubenswrapper[4732]: I1010 07:49:26.139489 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n77zh" event={"ID":"15fca5f0-b5e1-4a01-9fde-a342824b4594","Type":"ContainerDied","Data":"952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9"} Oct 10 07:49:26 crc kubenswrapper[4732]: I1010 07:49:26.140578 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n77zh" event={"ID":"15fca5f0-b5e1-4a01-9fde-a342824b4594","Type":"ContainerStarted","Data":"ac0e600c5b49a87bad621ed02b0594d12bde2ebdc02be1c963a86152d22f7a74"} Oct 10 07:49:27 crc kubenswrapper[4732]: I1010 07:49:27.153879 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n77zh" event={"ID":"15fca5f0-b5e1-4a01-9fde-a342824b4594","Type":"ContainerStarted","Data":"b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7"} Oct 10 07:49:28 crc kubenswrapper[4732]: I1010 07:49:28.167458 4732 generic.go:334] "Generic (PLEG): container finished" podID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerID="b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7" exitCode=0 Oct 10 07:49:28 crc kubenswrapper[4732]: I1010 07:49:28.167538 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n77zh" event={"ID":"15fca5f0-b5e1-4a01-9fde-a342824b4594","Type":"ContainerDied","Data":"b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7"} Oct 10 07:49:29 crc kubenswrapper[4732]: I1010 07:49:29.178356 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n77zh" event={"ID":"15fca5f0-b5e1-4a01-9fde-a342824b4594","Type":"ContainerStarted","Data":"2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211"} Oct 10 
07:49:29 crc kubenswrapper[4732]: I1010 07:49:29.206736 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n77zh" podStartSLOduration=2.6879215690000002 podStartE2EDuration="5.206717417s" podCreationTimestamp="2025-10-10 07:49:24 +0000 UTC" firstStartedPulling="2025-10-10 07:49:26.140567378 +0000 UTC m=+3493.210158619" lastFinishedPulling="2025-10-10 07:49:28.659363186 +0000 UTC m=+3495.728954467" observedRunningTime="2025-10-10 07:49:29.20236879 +0000 UTC m=+3496.271960071" watchObservedRunningTime="2025-10-10 07:49:29.206717417 +0000 UTC m=+3496.276308668" Oct 10 07:49:35 crc kubenswrapper[4732]: I1010 07:49:35.265374 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:35 crc kubenswrapper[4732]: I1010 07:49:35.267564 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:35 crc kubenswrapper[4732]: I1010 07:49:35.334088 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:36 crc kubenswrapper[4732]: I1010 07:49:36.312286 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:36 crc kubenswrapper[4732]: I1010 07:49:36.377793 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n77zh"] Oct 10 07:49:38 crc kubenswrapper[4732]: I1010 07:49:38.261719 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n77zh" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="registry-server" containerID="cri-o://2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211" gracePeriod=2 Oct 10 07:49:39 crc kubenswrapper[4732]: I1010 07:49:39.879469 4732 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:39 crc kubenswrapper[4732]: I1010 07:49:39.905651 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-catalog-content\") pod \"15fca5f0-b5e1-4a01-9fde-a342824b4594\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " Oct 10 07:49:39 crc kubenswrapper[4732]: I1010 07:49:39.905774 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-utilities\") pod \"15fca5f0-b5e1-4a01-9fde-a342824b4594\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " Oct 10 07:49:39 crc kubenswrapper[4732]: I1010 07:49:39.905871 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82qq8\" (UniqueName: \"kubernetes.io/projected/15fca5f0-b5e1-4a01-9fde-a342824b4594-kube-api-access-82qq8\") pod \"15fca5f0-b5e1-4a01-9fde-a342824b4594\" (UID: \"15fca5f0-b5e1-4a01-9fde-a342824b4594\") " Oct 10 07:49:39 crc kubenswrapper[4732]: I1010 07:49:39.908238 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-utilities" (OuterVolumeSpecName: "utilities") pod "15fca5f0-b5e1-4a01-9fde-a342824b4594" (UID: "15fca5f0-b5e1-4a01-9fde-a342824b4594"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:49:39 crc kubenswrapper[4732]: I1010 07:49:39.919232 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fca5f0-b5e1-4a01-9fde-a342824b4594-kube-api-access-82qq8" (OuterVolumeSpecName: "kube-api-access-82qq8") pod "15fca5f0-b5e1-4a01-9fde-a342824b4594" (UID: "15fca5f0-b5e1-4a01-9fde-a342824b4594"). 
InnerVolumeSpecName "kube-api-access-82qq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.001987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15fca5f0-b5e1-4a01-9fde-a342824b4594" (UID: "15fca5f0-b5e1-4a01-9fde-a342824b4594"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.007708 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.007748 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82qq8\" (UniqueName: \"kubernetes.io/projected/15fca5f0-b5e1-4a01-9fde-a342824b4594-kube-api-access-82qq8\") on node \"crc\" DevicePath \"\"" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.007770 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15fca5f0-b5e1-4a01-9fde-a342824b4594-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.280837 4732 generic.go:334] "Generic (PLEG): container finished" podID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerID="2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211" exitCode=0 Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.280918 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n77zh" event={"ID":"15fca5f0-b5e1-4a01-9fde-a342824b4594","Type":"ContainerDied","Data":"2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211"} Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.280963 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n77zh" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.280988 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n77zh" event={"ID":"15fca5f0-b5e1-4a01-9fde-a342824b4594","Type":"ContainerDied","Data":"ac0e600c5b49a87bad621ed02b0594d12bde2ebdc02be1c963a86152d22f7a74"} Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.281023 4732 scope.go:117] "RemoveContainer" containerID="2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.312913 4732 scope.go:117] "RemoveContainer" containerID="b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.336493 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n77zh"] Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.340759 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n77zh"] Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.373683 4732 scope.go:117] "RemoveContainer" containerID="952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.396664 4732 scope.go:117] "RemoveContainer" containerID="2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211" Oct 10 07:49:40 crc kubenswrapper[4732]: E1010 07:49:40.397371 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211\": container with ID starting with 2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211 not found: ID does not exist" containerID="2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211" Oct 10 07:49:40 crc 
kubenswrapper[4732]: I1010 07:49:40.397422 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211"} err="failed to get container status \"2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211\": rpc error: code = NotFound desc = could not find container \"2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211\": container with ID starting with 2a67790258d834b6425cc03e291716ce5ac366cddef39867f6db05d2649ea211 not found: ID does not exist" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.397459 4732 scope.go:117] "RemoveContainer" containerID="b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7" Oct 10 07:49:40 crc kubenswrapper[4732]: E1010 07:49:40.397933 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7\": container with ID starting with b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7 not found: ID does not exist" containerID="b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.398064 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7"} err="failed to get container status \"b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7\": rpc error: code = NotFound desc = could not find container \"b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7\": container with ID starting with b25030e34fc33d482a55d536dae24bd966d77eff713f7608261d5e0e176405b7 not found: ID does not exist" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.398111 4732 scope.go:117] "RemoveContainer" containerID="952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9" Oct 10 
07:49:40 crc kubenswrapper[4732]: E1010 07:49:40.399863 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9\": container with ID starting with 952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9 not found: ID does not exist" containerID="952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9" Oct 10 07:49:40 crc kubenswrapper[4732]: I1010 07:49:40.399899 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9"} err="failed to get container status \"952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9\": rpc error: code = NotFound desc = could not find container \"952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9\": container with ID starting with 952507db3cf93f1b16659e3d35f9dc59276b045027bebd8fe8b3beb87166baa9 not found: ID does not exist" Oct 10 07:49:41 crc kubenswrapper[4732]: I1010 07:49:41.677987 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" path="/var/lib/kubelet/pods/15fca5f0-b5e1-4a01-9fde-a342824b4594/volumes" Oct 10 07:49:55 crc kubenswrapper[4732]: I1010 07:49:55.356226 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:49:55 crc kubenswrapper[4732]: I1010 07:49:55.357008 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.355521 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.357306 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.357384 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.358312 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.358408 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" gracePeriod=600 Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.730623 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" 
containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" exitCode=0 Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.730719 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2"} Oct 10 07:50:25 crc kubenswrapper[4732]: I1010 07:50:25.730798 4732 scope.go:117] "RemoveContainer" containerID="08c62da30f16093e913de96793a90e2b44521000f060ca70d35236045d5b3004" Oct 10 07:50:26 crc kubenswrapper[4732]: E1010 07:50:26.016615 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:50:26 crc kubenswrapper[4732]: I1010 07:50:26.750666 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:50:26 crc kubenswrapper[4732]: E1010 07:50:26.751554 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:50:41 crc kubenswrapper[4732]: I1010 07:50:41.660956 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:50:41 crc kubenswrapper[4732]: E1010 
07:50:41.663899 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:50:52 crc kubenswrapper[4732]: I1010 07:50:52.660684 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:50:52 crc kubenswrapper[4732]: E1010 07:50:52.662555 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:51:03 crc kubenswrapper[4732]: I1010 07:51:03.680534 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:51:03 crc kubenswrapper[4732]: E1010 07:51:03.683663 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:51:18 crc kubenswrapper[4732]: I1010 07:51:18.660455 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:51:18 crc 
kubenswrapper[4732]: E1010 07:51:18.661524 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:51:29 crc kubenswrapper[4732]: I1010 07:51:29.660619 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:51:29 crc kubenswrapper[4732]: E1010 07:51:29.661628 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:51:41 crc kubenswrapper[4732]: I1010 07:51:41.661220 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:51:41 crc kubenswrapper[4732]: E1010 07:51:41.664237 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:51:53 crc kubenswrapper[4732]: I1010 07:51:53.667812 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 
10 07:51:53 crc kubenswrapper[4732]: E1010 07:51:53.668918 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:52:07 crc kubenswrapper[4732]: I1010 07:52:07.660235 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:52:07 crc kubenswrapper[4732]: E1010 07:52:07.661015 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:52:22 crc kubenswrapper[4732]: I1010 07:52:22.660668 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:52:22 crc kubenswrapper[4732]: E1010 07:52:22.661857 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:52:35 crc kubenswrapper[4732]: I1010 07:52:35.661629 4732 scope.go:117] "RemoveContainer" 
containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:52:35 crc kubenswrapper[4732]: E1010 07:52:35.662519 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.609032 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rzmjl"] Oct 10 07:52:48 crc kubenswrapper[4732]: E1010 07:52:48.610386 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="extract-utilities" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.610409 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="extract-utilities" Oct 10 07:52:48 crc kubenswrapper[4732]: E1010 07:52:48.610428 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="extract-content" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.610440 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="extract-content" Oct 10 07:52:48 crc kubenswrapper[4732]: E1010 07:52:48.610463 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="registry-server" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.610474 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="registry-server" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.610740 
4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fca5f0-b5e1-4a01-9fde-a342824b4594" containerName="registry-server" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.612441 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.643391 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzmjl"] Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.672720 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:52:48 crc kubenswrapper[4732]: E1010 07:52:48.672943 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.728378 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-utilities\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.728723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-catalog-content\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" 
Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.728921 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/dcac8178-7b04-4961-91b0-8830f7c44f3c-kube-api-access-wjvdl\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.830539 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/dcac8178-7b04-4961-91b0-8830f7c44f3c-kube-api-access-wjvdl\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.830685 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-utilities\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.830790 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-catalog-content\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.831257 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-utilities\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 
10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.831469 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-catalog-content\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.857380 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/dcac8178-7b04-4961-91b0-8830f7c44f3c-kube-api-access-wjvdl\") pod \"community-operators-rzmjl\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:48 crc kubenswrapper[4732]: I1010 07:52:48.934130 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:49 crc kubenswrapper[4732]: I1010 07:52:49.214639 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzmjl"] Oct 10 07:52:50 crc kubenswrapper[4732]: I1010 07:52:50.150960 4732 generic.go:334] "Generic (PLEG): container finished" podID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerID="2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b" exitCode=0 Oct 10 07:52:50 crc kubenswrapper[4732]: I1010 07:52:50.151246 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzmjl" event={"ID":"dcac8178-7b04-4961-91b0-8830f7c44f3c","Type":"ContainerDied","Data":"2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b"} Oct 10 07:52:50 crc kubenswrapper[4732]: I1010 07:52:50.151273 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzmjl" 
event={"ID":"dcac8178-7b04-4961-91b0-8830f7c44f3c","Type":"ContainerStarted","Data":"2ba8b3cf2ff3c07155532a645b22a3668cd1ae7b97a9e74dd6fc6e1aff4e71de"} Oct 10 07:52:50 crc kubenswrapper[4732]: I1010 07:52:50.155595 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:52:51 crc kubenswrapper[4732]: I1010 07:52:51.160988 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzmjl" event={"ID":"dcac8178-7b04-4961-91b0-8830f7c44f3c","Type":"ContainerStarted","Data":"bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52"} Oct 10 07:52:52 crc kubenswrapper[4732]: I1010 07:52:52.171833 4732 generic.go:334] "Generic (PLEG): container finished" podID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerID="bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52" exitCode=0 Oct 10 07:52:52 crc kubenswrapper[4732]: I1010 07:52:52.171926 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzmjl" event={"ID":"dcac8178-7b04-4961-91b0-8830f7c44f3c","Type":"ContainerDied","Data":"bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52"} Oct 10 07:52:53 crc kubenswrapper[4732]: I1010 07:52:53.187900 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzmjl" event={"ID":"dcac8178-7b04-4961-91b0-8830f7c44f3c","Type":"ContainerStarted","Data":"246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46"} Oct 10 07:52:53 crc kubenswrapper[4732]: I1010 07:52:53.232662 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rzmjl" podStartSLOduration=2.697728811 podStartE2EDuration="5.232633634s" podCreationTimestamp="2025-10-10 07:52:48 +0000 UTC" firstStartedPulling="2025-10-10 07:52:50.155224143 +0000 UTC m=+3697.224815404" lastFinishedPulling="2025-10-10 07:52:52.690128986 +0000 UTC 
m=+3699.759720227" observedRunningTime="2025-10-10 07:52:53.222469618 +0000 UTC m=+3700.292060949" watchObservedRunningTime="2025-10-10 07:52:53.232633634 +0000 UTC m=+3700.302224915" Oct 10 07:52:58 crc kubenswrapper[4732]: I1010 07:52:58.934954 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:58 crc kubenswrapper[4732]: I1010 07:52:58.935731 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:59 crc kubenswrapper[4732]: I1010 07:52:59.008444 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:59 crc kubenswrapper[4732]: I1010 07:52:59.300819 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:52:59 crc kubenswrapper[4732]: I1010 07:52:59.361650 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzmjl"] Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.264575 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rzmjl" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="registry-server" containerID="cri-o://246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46" gracePeriod=2 Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.709309 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.845037 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-catalog-content\") pod \"dcac8178-7b04-4961-91b0-8830f7c44f3c\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.845073 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/dcac8178-7b04-4961-91b0-8830f7c44f3c-kube-api-access-wjvdl\") pod \"dcac8178-7b04-4961-91b0-8830f7c44f3c\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.845121 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-utilities\") pod \"dcac8178-7b04-4961-91b0-8830f7c44f3c\" (UID: \"dcac8178-7b04-4961-91b0-8830f7c44f3c\") " Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.846247 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-utilities" (OuterVolumeSpecName: "utilities") pod "dcac8178-7b04-4961-91b0-8830f7c44f3c" (UID: "dcac8178-7b04-4961-91b0-8830f7c44f3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.853465 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcac8178-7b04-4961-91b0-8830f7c44f3c-kube-api-access-wjvdl" (OuterVolumeSpecName: "kube-api-access-wjvdl") pod "dcac8178-7b04-4961-91b0-8830f7c44f3c" (UID: "dcac8178-7b04-4961-91b0-8830f7c44f3c"). InnerVolumeSpecName "kube-api-access-wjvdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.946957 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjvdl\" (UniqueName: \"kubernetes.io/projected/dcac8178-7b04-4961-91b0-8830f7c44f3c-kube-api-access-wjvdl\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:01 crc kubenswrapper[4732]: I1010 07:53:01.948000 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.280055 4732 generic.go:334] "Generic (PLEG): container finished" podID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerID="246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46" exitCode=0 Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.280166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzmjl" event={"ID":"dcac8178-7b04-4961-91b0-8830f7c44f3c","Type":"ContainerDied","Data":"246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46"} Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.280268 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzmjl" event={"ID":"dcac8178-7b04-4961-91b0-8830f7c44f3c","Type":"ContainerDied","Data":"2ba8b3cf2ff3c07155532a645b22a3668cd1ae7b97a9e74dd6fc6e1aff4e71de"} Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.280335 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rzmjl" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.280344 4732 scope.go:117] "RemoveContainer" containerID="246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.315313 4732 scope.go:117] "RemoveContainer" containerID="bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.355131 4732 scope.go:117] "RemoveContainer" containerID="2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.390343 4732 scope.go:117] "RemoveContainer" containerID="246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46" Oct 10 07:53:02 crc kubenswrapper[4732]: E1010 07:53:02.390941 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46\": container with ID starting with 246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46 not found: ID does not exist" containerID="246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.391041 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46"} err="failed to get container status \"246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46\": rpc error: code = NotFound desc = could not find container \"246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46\": container with ID starting with 246897d684bacb4620bb5cdf33aa8d940b397e78e58cb40f1580ba60f96b1f46 not found: ID does not exist" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.391085 4732 scope.go:117] "RemoveContainer" 
containerID="bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52" Oct 10 07:53:02 crc kubenswrapper[4732]: E1010 07:53:02.391637 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52\": container with ID starting with bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52 not found: ID does not exist" containerID="bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.391710 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52"} err="failed to get container status \"bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52\": rpc error: code = NotFound desc = could not find container \"bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52\": container with ID starting with bb2319a47422ba4274002696b959da401f79c93cb1a7b6da6aa2764c81e2dd52 not found: ID does not exist" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.391742 4732 scope.go:117] "RemoveContainer" containerID="2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b" Oct 10 07:53:02 crc kubenswrapper[4732]: E1010 07:53:02.392087 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b\": container with ID starting with 2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b not found: ID does not exist" containerID="2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.392159 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b"} err="failed to get container status \"2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b\": rpc error: code = NotFound desc = could not find container \"2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b\": container with ID starting with 2d802051721f2e0b643f23c0595f90cb680cb7539fb2d13707d55a4d34ceec4b not found: ID does not exist" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.597174 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dcac8178-7b04-4961-91b0-8830f7c44f3c" (UID: "dcac8178-7b04-4961-91b0-8830f7c44f3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.659834 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcac8178-7b04-4961-91b0-8830f7c44f3c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.936795 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzmjl"] Oct 10 07:53:02 crc kubenswrapper[4732]: I1010 07:53:02.943799 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rzmjl"] Oct 10 07:53:03 crc kubenswrapper[4732]: I1010 07:53:03.668609 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:53:03 crc kubenswrapper[4732]: E1010 07:53:03.669635 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:53:03 crc kubenswrapper[4732]: I1010 07:53:03.678000 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" path="/var/lib/kubelet/pods/dcac8178-7b04-4961-91b0-8830f7c44f3c/volumes" Oct 10 07:53:17 crc kubenswrapper[4732]: I1010 07:53:17.660674 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:53:17 crc kubenswrapper[4732]: E1010 07:53:17.664766 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:53:28 crc kubenswrapper[4732]: I1010 07:53:28.660929 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:53:28 crc kubenswrapper[4732]: E1010 07:53:28.661974 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:53:43 crc kubenswrapper[4732]: I1010 07:53:43.668771 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 
10 07:53:43 crc kubenswrapper[4732]: E1010 07:53:43.670024 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:53:54 crc kubenswrapper[4732]: I1010 07:53:54.661014 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:53:54 crc kubenswrapper[4732]: E1010 07:53:54.662296 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:54:07 crc kubenswrapper[4732]: I1010 07:54:07.661283 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:54:07 crc kubenswrapper[4732]: E1010 07:54:07.662244 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:54:19 crc kubenswrapper[4732]: I1010 07:54:19.660568 4732 scope.go:117] "RemoveContainer" 
containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:54:19 crc kubenswrapper[4732]: E1010 07:54:19.661576 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:54:30 crc kubenswrapper[4732]: I1010 07:54:30.661298 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:54:30 crc kubenswrapper[4732]: E1010 07:54:30.662389 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:54:45 crc kubenswrapper[4732]: I1010 07:54:45.661054 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:54:45 crc kubenswrapper[4732]: E1010 07:54:45.662392 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:54:57 crc kubenswrapper[4732]: I1010 07:54:57.659953 4732 scope.go:117] 
"RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:54:57 crc kubenswrapper[4732]: E1010 07:54:57.660681 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:55:09 crc kubenswrapper[4732]: I1010 07:55:09.660484 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:55:09 crc kubenswrapper[4732]: E1010 07:55:09.661419 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:55:22 crc kubenswrapper[4732]: I1010 07:55:22.660514 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:55:22 crc kubenswrapper[4732]: E1010 07:55:22.662112 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 07:55:36 crc kubenswrapper[4732]: I1010 07:55:36.660857 
4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:55:37 crc kubenswrapper[4732]: I1010 07:55:37.785333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"2c1cbee0cabb0dc89f4c124bf1a1fdd99fa47b775da4a9e795f4e601c5c7d64a"} Oct 10 07:57:55 crc kubenswrapper[4732]: I1010 07:57:55.356084 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:57:55 crc kubenswrapper[4732]: I1010 07:57:55.356864 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.212942 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kd8wq"] Oct 10 07:58:22 crc kubenswrapper[4732]: E1010 07:58:22.213854 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="extract-utilities" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.213870 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="extract-utilities" Oct 10 07:58:22 crc kubenswrapper[4732]: E1010 07:58:22.213886 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="extract-content" Oct 10 07:58:22 crc 
kubenswrapper[4732]: I1010 07:58:22.213894 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="extract-content" Oct 10 07:58:22 crc kubenswrapper[4732]: E1010 07:58:22.213915 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="registry-server" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.213923 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="registry-server" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.214115 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcac8178-7b04-4961-91b0-8830f7c44f3c" containerName="registry-server" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.215244 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.225763 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kd8wq"] Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.370601 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-utilities\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.370734 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczqn\" (UniqueName: \"kubernetes.io/projected/a6bdd67c-b031-435c-b05f-83143d78cc86-kube-api-access-bczqn\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc 
kubenswrapper[4732]: I1010 07:58:22.370903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-catalog-content\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.471979 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-utilities\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.472071 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bczqn\" (UniqueName: \"kubernetes.io/projected/a6bdd67c-b031-435c-b05f-83143d78cc86-kube-api-access-bczqn\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.472152 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-catalog-content\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.472683 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-utilities\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: 
I1010 07:58:22.472817 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-catalog-content\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.494088 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczqn\" (UniqueName: \"kubernetes.io/projected/a6bdd67c-b031-435c-b05f-83143d78cc86-kube-api-access-bczqn\") pod \"redhat-marketplace-kd8wq\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.551204 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:22 crc kubenswrapper[4732]: I1010 07:58:22.765921 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kd8wq"] Oct 10 07:58:23 crc kubenswrapper[4732]: I1010 07:58:23.388474 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerID="66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70" exitCode=0 Oct 10 07:58:23 crc kubenswrapper[4732]: I1010 07:58:23.388729 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kd8wq" event={"ID":"a6bdd67c-b031-435c-b05f-83143d78cc86","Type":"ContainerDied","Data":"66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70"} Oct 10 07:58:23 crc kubenswrapper[4732]: I1010 07:58:23.388814 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kd8wq" 
event={"ID":"a6bdd67c-b031-435c-b05f-83143d78cc86","Type":"ContainerStarted","Data":"cf6a04908ac94d8f8f55d4bf57f490d9de74ac08af38a082c4794359191c7e06"} Oct 10 07:58:23 crc kubenswrapper[4732]: I1010 07:58:23.391016 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 07:58:24 crc kubenswrapper[4732]: I1010 07:58:24.400115 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerID="5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c" exitCode=0 Oct 10 07:58:24 crc kubenswrapper[4732]: I1010 07:58:24.400504 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kd8wq" event={"ID":"a6bdd67c-b031-435c-b05f-83143d78cc86","Type":"ContainerDied","Data":"5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c"} Oct 10 07:58:25 crc kubenswrapper[4732]: I1010 07:58:25.355982 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:58:25 crc kubenswrapper[4732]: I1010 07:58:25.356079 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:58:25 crc kubenswrapper[4732]: I1010 07:58:25.411410 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kd8wq" event={"ID":"a6bdd67c-b031-435c-b05f-83143d78cc86","Type":"ContainerStarted","Data":"28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a"} Oct 10 07:58:25 crc 
kubenswrapper[4732]: I1010 07:58:25.438421 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kd8wq" podStartSLOduration=1.914276835 podStartE2EDuration="3.438397079s" podCreationTimestamp="2025-10-10 07:58:22 +0000 UTC" firstStartedPulling="2025-10-10 07:58:23.39065479 +0000 UTC m=+4030.460246041" lastFinishedPulling="2025-10-10 07:58:24.914775014 +0000 UTC m=+4031.984366285" observedRunningTime="2025-10-10 07:58:25.437948487 +0000 UTC m=+4032.507539728" watchObservedRunningTime="2025-10-10 07:58:25.438397079 +0000 UTC m=+4032.507988350" Oct 10 07:58:32 crc kubenswrapper[4732]: I1010 07:58:32.552000 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:32 crc kubenswrapper[4732]: I1010 07:58:32.552514 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:32 crc kubenswrapper[4732]: I1010 07:58:32.635916 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:33 crc kubenswrapper[4732]: I1010 07:58:33.542203 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:33 crc kubenswrapper[4732]: I1010 07:58:33.604579 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kd8wq"] Oct 10 07:58:35 crc kubenswrapper[4732]: I1010 07:58:35.509068 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kd8wq" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="registry-server" containerID="cri-o://28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a" gracePeriod=2 Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.270038 4732 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.393651 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-catalog-content\") pod \"a6bdd67c-b031-435c-b05f-83143d78cc86\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.393913 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-utilities\") pod \"a6bdd67c-b031-435c-b05f-83143d78cc86\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.393965 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bczqn\" (UniqueName: \"kubernetes.io/projected/a6bdd67c-b031-435c-b05f-83143d78cc86-kube-api-access-bczqn\") pod \"a6bdd67c-b031-435c-b05f-83143d78cc86\" (UID: \"a6bdd67c-b031-435c-b05f-83143d78cc86\") " Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.394915 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-utilities" (OuterVolumeSpecName: "utilities") pod "a6bdd67c-b031-435c-b05f-83143d78cc86" (UID: "a6bdd67c-b031-435c-b05f-83143d78cc86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.400950 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bdd67c-b031-435c-b05f-83143d78cc86-kube-api-access-bczqn" (OuterVolumeSpecName: "kube-api-access-bczqn") pod "a6bdd67c-b031-435c-b05f-83143d78cc86" (UID: "a6bdd67c-b031-435c-b05f-83143d78cc86"). 
InnerVolumeSpecName "kube-api-access-bczqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.418683 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6bdd67c-b031-435c-b05f-83143d78cc86" (UID: "a6bdd67c-b031-435c-b05f-83143d78cc86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.495436 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.495476 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6bdd67c-b031-435c-b05f-83143d78cc86-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.495486 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bczqn\" (UniqueName: \"kubernetes.io/projected/a6bdd67c-b031-435c-b05f-83143d78cc86-kube-api-access-bczqn\") on node \"crc\" DevicePath \"\"" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.517712 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerID="28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a" exitCode=0 Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.517749 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kd8wq" event={"ID":"a6bdd67c-b031-435c-b05f-83143d78cc86","Type":"ContainerDied","Data":"28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a"} Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.517765 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kd8wq" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.517789 4732 scope.go:117] "RemoveContainer" containerID="28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.517776 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kd8wq" event={"ID":"a6bdd67c-b031-435c-b05f-83143d78cc86","Type":"ContainerDied","Data":"cf6a04908ac94d8f8f55d4bf57f490d9de74ac08af38a082c4794359191c7e06"} Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.538447 4732 scope.go:117] "RemoveContainer" containerID="5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.548906 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kd8wq"] Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.558401 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kd8wq"] Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.615272 4732 scope.go:117] "RemoveContainer" containerID="66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.635029 4732 scope.go:117] "RemoveContainer" containerID="28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a" Oct 10 07:58:36 crc kubenswrapper[4732]: E1010 07:58:36.635782 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a\": container with ID starting with 28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a not found: ID does not exist" containerID="28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a" Oct 10 07:58:36 crc 
kubenswrapper[4732]: I1010 07:58:36.635923 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a"} err="failed to get container status \"28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a\": rpc error: code = NotFound desc = could not find container \"28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a\": container with ID starting with 28d8a2de72180bd1782c16adfac6057c9f070e94f34299a3f191a15fdc09fd7a not found: ID does not exist" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.636035 4732 scope.go:117] "RemoveContainer" containerID="5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c" Oct 10 07:58:36 crc kubenswrapper[4732]: E1010 07:58:36.636643 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c\": container with ID starting with 5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c not found: ID does not exist" containerID="5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.636683 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c"} err="failed to get container status \"5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c\": rpc error: code = NotFound desc = could not find container \"5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c\": container with ID starting with 5f670df100e74dd11760b3b262899816d8ef6224cc40000eb55e5537811ba03c not found: ID does not exist" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.636773 4732 scope.go:117] "RemoveContainer" containerID="66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70" Oct 10 
07:58:36 crc kubenswrapper[4732]: E1010 07:58:36.637290 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70\": container with ID starting with 66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70 not found: ID does not exist" containerID="66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70" Oct 10 07:58:36 crc kubenswrapper[4732]: I1010 07:58:36.637366 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70"} err="failed to get container status \"66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70\": rpc error: code = NotFound desc = could not find container \"66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70\": container with ID starting with 66628e0bc17ca9e033945c44ace0a09aa90494d4ffe067b83fcdbc42c26f7a70 not found: ID does not exist" Oct 10 07:58:37 crc kubenswrapper[4732]: I1010 07:58:37.680499 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" path="/var/lib/kubelet/pods/a6bdd67c-b031-435c-b05f-83143d78cc86/volumes" Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.356567 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.357903 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.358004 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.359224 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c1cbee0cabb0dc89f4c124bf1a1fdd99fa47b775da4a9e795f4e601c5c7d64a"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.359340 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://2c1cbee0cabb0dc89f4c124bf1a1fdd99fa47b775da4a9e795f4e601c5c7d64a" gracePeriod=600 Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.693259 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="2c1cbee0cabb0dc89f4c124bf1a1fdd99fa47b775da4a9e795f4e601c5c7d64a" exitCode=0 Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.693315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"2c1cbee0cabb0dc89f4c124bf1a1fdd99fa47b775da4a9e795f4e601c5c7d64a"} Oct 10 07:58:55 crc kubenswrapper[4732]: I1010 07:58:55.693348 4732 scope.go:117] "RemoveContainer" containerID="fa39a4c004aac08518ee02139a86ce7e1430cf2723360981ac910e665ab5e5a2" Oct 10 07:58:56 crc kubenswrapper[4732]: I1010 07:58:56.707778 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629"} Oct 10 07:58:57 crc kubenswrapper[4732]: I1010 07:58:57.997987 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vqzd"] Oct 10 07:58:58 crc kubenswrapper[4732]: E1010 07:58:58.000718 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="registry-server" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.000986 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="registry-server" Oct 10 07:58:58 crc kubenswrapper[4732]: E1010 07:58:58.001318 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="extract-utilities" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.001554 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="extract-utilities" Oct 10 07:58:58 crc kubenswrapper[4732]: E1010 07:58:58.001681 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="extract-content" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.002741 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="extract-content" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.003313 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6bdd67c-b031-435c-b05f-83143d78cc86" containerName="registry-server" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.007502 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.008967 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vqzd"] Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.118191 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-utilities\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.118273 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-catalog-content\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.118367 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtbvp\" (UniqueName: \"kubernetes.io/projected/e317f829-34a8-4c80-89a8-91b4ee65c51c-kube-api-access-jtbvp\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.219954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-utilities\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.220015 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-catalog-content\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.220067 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtbvp\" (UniqueName: \"kubernetes.io/projected/e317f829-34a8-4c80-89a8-91b4ee65c51c-kube-api-access-jtbvp\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.220506 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-utilities\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.220729 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-catalog-content\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.246001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtbvp\" (UniqueName: \"kubernetes.io/projected/e317f829-34a8-4c80-89a8-91b4ee65c51c-kube-api-access-jtbvp\") pod \"certified-operators-7vqzd\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.342107 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:58:58 crc kubenswrapper[4732]: I1010 07:58:58.872790 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vqzd"] Oct 10 07:58:59 crc kubenswrapper[4732]: I1010 07:58:59.729901 4732 generic.go:334] "Generic (PLEG): container finished" podID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerID="9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31" exitCode=0 Oct 10 07:58:59 crc kubenswrapper[4732]: I1010 07:58:59.730026 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vqzd" event={"ID":"e317f829-34a8-4c80-89a8-91b4ee65c51c","Type":"ContainerDied","Data":"9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31"} Oct 10 07:58:59 crc kubenswrapper[4732]: I1010 07:58:59.730271 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vqzd" event={"ID":"e317f829-34a8-4c80-89a8-91b4ee65c51c","Type":"ContainerStarted","Data":"f88cc9ecec407fabe066958744278566b6908fe0bdf5a6916c1b181a935ef724"} Oct 10 07:59:00 crc kubenswrapper[4732]: I1010 07:59:00.744857 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vqzd" event={"ID":"e317f829-34a8-4c80-89a8-91b4ee65c51c","Type":"ContainerStarted","Data":"692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6"} Oct 10 07:59:01 crc kubenswrapper[4732]: I1010 07:59:01.760350 4732 generic.go:334] "Generic (PLEG): container finished" podID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerID="692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6" exitCode=0 Oct 10 07:59:01 crc kubenswrapper[4732]: I1010 07:59:01.760409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vqzd" 
event={"ID":"e317f829-34a8-4c80-89a8-91b4ee65c51c","Type":"ContainerDied","Data":"692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6"} Oct 10 07:59:02 crc kubenswrapper[4732]: I1010 07:59:02.772191 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vqzd" event={"ID":"e317f829-34a8-4c80-89a8-91b4ee65c51c","Type":"ContainerStarted","Data":"ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896"} Oct 10 07:59:02 crc kubenswrapper[4732]: I1010 07:59:02.804627 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vqzd" podStartSLOduration=3.170496463 podStartE2EDuration="5.804610033s" podCreationTimestamp="2025-10-10 07:58:57 +0000 UTC" firstStartedPulling="2025-10-10 07:58:59.732507556 +0000 UTC m=+4066.802098797" lastFinishedPulling="2025-10-10 07:59:02.366621086 +0000 UTC m=+4069.436212367" observedRunningTime="2025-10-10 07:59:02.799383221 +0000 UTC m=+4069.868974482" watchObservedRunningTime="2025-10-10 07:59:02.804610033 +0000 UTC m=+4069.874201294" Oct 10 07:59:08 crc kubenswrapper[4732]: I1010 07:59:08.343230 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:59:08 crc kubenswrapper[4732]: I1010 07:59:08.344937 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:59:08 crc kubenswrapper[4732]: I1010 07:59:08.420175 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:59:08 crc kubenswrapper[4732]: I1010 07:59:08.905441 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:59:08 crc kubenswrapper[4732]: I1010 07:59:08.959466 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-7vqzd"] Oct 10 07:59:10 crc kubenswrapper[4732]: I1010 07:59:10.848633 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7vqzd" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="registry-server" containerID="cri-o://ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896" gracePeriod=2 Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.512277 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.631344 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtbvp\" (UniqueName: \"kubernetes.io/projected/e317f829-34a8-4c80-89a8-91b4ee65c51c-kube-api-access-jtbvp\") pod \"e317f829-34a8-4c80-89a8-91b4ee65c51c\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.631460 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-utilities\") pod \"e317f829-34a8-4c80-89a8-91b4ee65c51c\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.631561 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-catalog-content\") pod \"e317f829-34a8-4c80-89a8-91b4ee65c51c\" (UID: \"e317f829-34a8-4c80-89a8-91b4ee65c51c\") " Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.632562 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-utilities" (OuterVolumeSpecName: "utilities") pod "e317f829-34a8-4c80-89a8-91b4ee65c51c" (UID: 
"e317f829-34a8-4c80-89a8-91b4ee65c51c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.646121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e317f829-34a8-4c80-89a8-91b4ee65c51c-kube-api-access-jtbvp" (OuterVolumeSpecName: "kube-api-access-jtbvp") pod "e317f829-34a8-4c80-89a8-91b4ee65c51c" (UID: "e317f829-34a8-4c80-89a8-91b4ee65c51c"). InnerVolumeSpecName "kube-api-access-jtbvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.705119 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e317f829-34a8-4c80-89a8-91b4ee65c51c" (UID: "e317f829-34a8-4c80-89a8-91b4ee65c51c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.732752 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.733069 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtbvp\" (UniqueName: \"kubernetes.io/projected/e317f829-34a8-4c80-89a8-91b4ee65c51c-kube-api-access-jtbvp\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.733283 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e317f829-34a8-4c80-89a8-91b4ee65c51c-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.860028 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerID="ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896" exitCode=0 Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.860075 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vqzd" event={"ID":"e317f829-34a8-4c80-89a8-91b4ee65c51c","Type":"ContainerDied","Data":"ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896"} Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.860093 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vqzd" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.860106 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vqzd" event={"ID":"e317f829-34a8-4c80-89a8-91b4ee65c51c","Type":"ContainerDied","Data":"f88cc9ecec407fabe066958744278566b6908fe0bdf5a6916c1b181a935ef724"} Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.860130 4732 scope.go:117] "RemoveContainer" containerID="ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.886902 4732 scope.go:117] "RemoveContainer" containerID="692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.904078 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vqzd"] Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.906283 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vqzd"] Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.918261 4732 scope.go:117] "RemoveContainer" containerID="9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.949403 4732 scope.go:117] "RemoveContainer" 
containerID="ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896" Oct 10 07:59:11 crc kubenswrapper[4732]: E1010 07:59:11.950306 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896\": container with ID starting with ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896 not found: ID does not exist" containerID="ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.950388 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896"} err="failed to get container status \"ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896\": rpc error: code = NotFound desc = could not find container \"ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896\": container with ID starting with ef7de1cf91296ac8c28692541ec2895f932b6571aba1377c93fddbc07c9d7896 not found: ID does not exist" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.950478 4732 scope.go:117] "RemoveContainer" containerID="692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6" Oct 10 07:59:11 crc kubenswrapper[4732]: E1010 07:59:11.951015 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6\": container with ID starting with 692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6 not found: ID does not exist" containerID="692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.951065 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6"} err="failed to get container status \"692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6\": rpc error: code = NotFound desc = could not find container \"692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6\": container with ID starting with 692d395e0b5ee0167968bfb008937de3398a4a230cd12b2b516430cf33ab8bc6 not found: ID does not exist" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.951084 4732 scope.go:117] "RemoveContainer" containerID="9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31" Oct 10 07:59:11 crc kubenswrapper[4732]: E1010 07:59:11.951521 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31\": container with ID starting with 9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31 not found: ID does not exist" containerID="9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31" Oct 10 07:59:11 crc kubenswrapper[4732]: I1010 07:59:11.951576 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31"} err="failed to get container status \"9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31\": rpc error: code = NotFound desc = could not find container \"9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31\": container with ID starting with 9185186a254262370114df5e089f59facaee360dc67d1db16434e284fdf7aa31 not found: ID does not exist" Oct 10 07:59:13 crc kubenswrapper[4732]: I1010 07:59:13.677522 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" path="/var/lib/kubelet/pods/e317f829-34a8-4c80-89a8-91b4ee65c51c/volumes" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 
08:00:00.173335 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx"] Oct 10 08:00:00 crc kubenswrapper[4732]: E1010 08:00:00.174565 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="extract-content" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.174589 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="extract-content" Oct 10 08:00:00 crc kubenswrapper[4732]: E1010 08:00:00.174611 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="registry-server" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.174623 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="registry-server" Oct 10 08:00:00 crc kubenswrapper[4732]: E1010 08:00:00.174654 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="extract-utilities" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.174667 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="extract-utilities" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.175008 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e317f829-34a8-4c80-89a8-91b4ee65c51c" containerName="registry-server" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.175916 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.182483 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.182584 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.195944 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx"] Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.293605 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d03988de-67cd-458f-98e5-8913168773f4-secret-volume\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.294004 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d03988de-67cd-458f-98e5-8913168773f4-config-volume\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.294190 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jmk9\" (UniqueName: \"kubernetes.io/projected/d03988de-67cd-458f-98e5-8913168773f4-kube-api-access-6jmk9\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.395523 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d03988de-67cd-458f-98e5-8913168773f4-secret-volume\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.395607 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d03988de-67cd-458f-98e5-8913168773f4-config-volume\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.395636 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jmk9\" (UniqueName: \"kubernetes.io/projected/d03988de-67cd-458f-98e5-8913168773f4-kube-api-access-6jmk9\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.396922 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d03988de-67cd-458f-98e5-8913168773f4-config-volume\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.405517 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d03988de-67cd-458f-98e5-8913168773f4-secret-volume\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.415905 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jmk9\" (UniqueName: \"kubernetes.io/projected/d03988de-67cd-458f-98e5-8913168773f4-kube-api-access-6jmk9\") pod \"collect-profiles-29334720-k2tdx\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.518918 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:00 crc kubenswrapper[4732]: I1010 08:00:00.947986 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx"] Oct 10 08:00:01 crc kubenswrapper[4732]: I1010 08:00:01.302547 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" event={"ID":"d03988de-67cd-458f-98e5-8913168773f4","Type":"ContainerStarted","Data":"238c2ff99494f459911c656fb1f271494b04aa49f3b0917b158c46854a3ac05b"} Oct 10 08:00:01 crc kubenswrapper[4732]: I1010 08:00:01.302591 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" event={"ID":"d03988de-67cd-458f-98e5-8913168773f4","Type":"ContainerStarted","Data":"48cf7add79610e50a723c809f54f4fa5d8c479d3710d6c4a0ddce2fb29997d56"} Oct 10 08:00:02 crc kubenswrapper[4732]: I1010 08:00:02.314636 4732 generic.go:334] "Generic (PLEG): container finished" podID="d03988de-67cd-458f-98e5-8913168773f4" 
containerID="238c2ff99494f459911c656fb1f271494b04aa49f3b0917b158c46854a3ac05b" exitCode=0 Oct 10 08:00:02 crc kubenswrapper[4732]: I1010 08:00:02.314778 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" event={"ID":"d03988de-67cd-458f-98e5-8913168773f4","Type":"ContainerDied","Data":"238c2ff99494f459911c656fb1f271494b04aa49f3b0917b158c46854a3ac05b"} Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.695135 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.850587 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d03988de-67cd-458f-98e5-8913168773f4-secret-volume\") pod \"d03988de-67cd-458f-98e5-8913168773f4\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.850797 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jmk9\" (UniqueName: \"kubernetes.io/projected/d03988de-67cd-458f-98e5-8913168773f4-kube-api-access-6jmk9\") pod \"d03988de-67cd-458f-98e5-8913168773f4\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.850847 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d03988de-67cd-458f-98e5-8913168773f4-config-volume\") pod \"d03988de-67cd-458f-98e5-8913168773f4\" (UID: \"d03988de-67cd-458f-98e5-8913168773f4\") " Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.851566 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d03988de-67cd-458f-98e5-8913168773f4-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"d03988de-67cd-458f-98e5-8913168773f4" (UID: "d03988de-67cd-458f-98e5-8913168773f4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.859512 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d03988de-67cd-458f-98e5-8913168773f4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d03988de-67cd-458f-98e5-8913168773f4" (UID: "d03988de-67cd-458f-98e5-8913168773f4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.859947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03988de-67cd-458f-98e5-8913168773f4-kube-api-access-6jmk9" (OuterVolumeSpecName: "kube-api-access-6jmk9") pod "d03988de-67cd-458f-98e5-8913168773f4" (UID: "d03988de-67cd-458f-98e5-8913168773f4"). InnerVolumeSpecName "kube-api-access-6jmk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.952041 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jmk9\" (UniqueName: \"kubernetes.io/projected/d03988de-67cd-458f-98e5-8913168773f4-kube-api-access-6jmk9\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.952092 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d03988de-67cd-458f-98e5-8913168773f4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:03 crc kubenswrapper[4732]: I1010 08:00:03.952108 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d03988de-67cd-458f-98e5-8913168773f4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:00:04 crc kubenswrapper[4732]: I1010 08:00:04.335261 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" event={"ID":"d03988de-67cd-458f-98e5-8913168773f4","Type":"ContainerDied","Data":"48cf7add79610e50a723c809f54f4fa5d8c479d3710d6c4a0ddce2fb29997d56"} Oct 10 08:00:04 crc kubenswrapper[4732]: I1010 08:00:04.335312 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48cf7add79610e50a723c809f54f4fa5d8c479d3710d6c4a0ddce2fb29997d56" Oct 10 08:00:04 crc kubenswrapper[4732]: I1010 08:00:04.335377 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx" Oct 10 08:00:04 crc kubenswrapper[4732]: I1010 08:00:04.426307 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj"] Oct 10 08:00:04 crc kubenswrapper[4732]: I1010 08:00:04.432772 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334675-jdlfj"] Oct 10 08:00:05 crc kubenswrapper[4732]: I1010 08:00:05.677512 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e4a69df-b47c-4d0f-b438-11b3be02eabb" path="/var/lib/kubelet/pods/0e4a69df-b47c-4d0f-b438-11b3be02eabb/volumes" Oct 10 08:00:24 crc kubenswrapper[4732]: I1010 08:00:24.347088 4732 scope.go:117] "RemoveContainer" containerID="ed2c4ae9b06385d150392f311eae71369d6320f51e7558c8214bfde52c736ec5" Oct 10 08:00:55 crc kubenswrapper[4732]: I1010 08:00:55.356640 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:00:55 crc kubenswrapper[4732]: I1010 08:00:55.357450 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:01:25 crc kubenswrapper[4732]: I1010 08:01:25.356623 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 10 08:01:25 crc kubenswrapper[4732]: I1010 08:01:25.357603 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:01:55 crc kubenswrapper[4732]: I1010 08:01:55.356095 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:01:55 crc kubenswrapper[4732]: I1010 08:01:55.356948 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:01:55 crc kubenswrapper[4732]: I1010 08:01:55.357020 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:01:55 crc kubenswrapper[4732]: I1010 08:01:55.358025 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:01:55 crc kubenswrapper[4732]: I1010 08:01:55.358136 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" gracePeriod=600 Oct 10 08:01:55 crc kubenswrapper[4732]: E1010 08:01:55.494172 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:01:56 crc kubenswrapper[4732]: I1010 08:01:56.443579 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" exitCode=0 Oct 10 08:01:56 crc kubenswrapper[4732]: I1010 08:01:56.443746 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629"} Oct 10 08:01:56 crc kubenswrapper[4732]: I1010 08:01:56.444097 4732 scope.go:117] "RemoveContainer" containerID="2c1cbee0cabb0dc89f4c124bf1a1fdd99fa47b775da4a9e795f4e601c5c7d64a" Oct 10 08:01:56 crc kubenswrapper[4732]: I1010 08:01:56.447235 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:01:56 crc kubenswrapper[4732]: E1010 08:01:56.448742 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:02:09 crc kubenswrapper[4732]: I1010 08:02:09.660856 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:02:09 crc kubenswrapper[4732]: E1010 08:02:09.662171 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:02:24 crc kubenswrapper[4732]: I1010 08:02:24.660773 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:02:24 crc kubenswrapper[4732]: E1010 08:02:24.661632 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:02:36 crc kubenswrapper[4732]: I1010 08:02:36.659831 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:02:36 crc kubenswrapper[4732]: E1010 08:02:36.660906 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:02:49 crc kubenswrapper[4732]: I1010 08:02:49.660798 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:02:49 crc kubenswrapper[4732]: E1010 08:02:49.661679 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:03:03 crc kubenswrapper[4732]: I1010 08:03:03.676973 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:03:03 crc kubenswrapper[4732]: E1010 08:03:03.679128 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:03:14 crc kubenswrapper[4732]: I1010 08:03:14.660351 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:03:14 crc kubenswrapper[4732]: E1010 08:03:14.661420 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.579594 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b5d29"] Oct 10 08:03:20 crc kubenswrapper[4732]: E1010 08:03:20.580349 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03988de-67cd-458f-98e5-8913168773f4" containerName="collect-profiles" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.580367 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03988de-67cd-458f-98e5-8913168773f4" containerName="collect-profiles" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.580565 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03988de-67cd-458f-98e5-8913168773f4" containerName="collect-profiles" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.585131 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.593483 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5d29"] Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.740537 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7tk\" (UniqueName: \"kubernetes.io/projected/4e84888c-c04f-40e8-b37c-bed6b2628cff-kube-api-access-kt7tk\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.741020 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-catalog-content\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.741081 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-utilities\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.842230 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7tk\" (UniqueName: \"kubernetes.io/projected/4e84888c-c04f-40e8-b37c-bed6b2628cff-kube-api-access-kt7tk\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.842356 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-catalog-content\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.842406 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-utilities\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.842907 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-catalog-content\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.843008 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-utilities\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.870882 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7tk\" (UniqueName: \"kubernetes.io/projected/4e84888c-c04f-40e8-b37c-bed6b2628cff-kube-api-access-kt7tk\") pod \"community-operators-b5d29\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:20 crc kubenswrapper[4732]: I1010 08:03:20.953844 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.179752 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m84lc"] Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.181573 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.196813 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m84lc"] Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.255818 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b7g5\" (UniqueName: \"kubernetes.io/projected/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-kube-api-access-7b7g5\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.255901 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-utilities\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.255941 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-catalog-content\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.356777 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-catalog-content\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.356874 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b7g5\" (UniqueName: \"kubernetes.io/projected/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-kube-api-access-7b7g5\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.356924 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-utilities\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.357370 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-catalog-content\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.357387 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-utilities\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.382681 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b7g5\" (UniqueName: 
\"kubernetes.io/projected/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-kube-api-access-7b7g5\") pod \"redhat-operators-m84lc\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.427837 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b5d29"] Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.509207 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:21 crc kubenswrapper[4732]: I1010 08:03:21.930974 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m84lc"] Oct 10 08:03:22 crc kubenswrapper[4732]: I1010 08:03:22.238645 4732 generic.go:334] "Generic (PLEG): container finished" podID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerID="1758464dda8513259c719998cd31892cf7663ee1477126809ba0f2ffb025ed62" exitCode=0 Oct 10 08:03:22 crc kubenswrapper[4732]: I1010 08:03:22.238688 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m84lc" event={"ID":"256dc7f3-ecb5-4f7a-98e7-89ff022cda06","Type":"ContainerDied","Data":"1758464dda8513259c719998cd31892cf7663ee1477126809ba0f2ffb025ed62"} Oct 10 08:03:22 crc kubenswrapper[4732]: I1010 08:03:22.238741 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m84lc" event={"ID":"256dc7f3-ecb5-4f7a-98e7-89ff022cda06","Type":"ContainerStarted","Data":"e930d24c0791dc6acaa3892c0df93b32fc2546991cb27a6018aceb1297b8f7c5"} Oct 10 08:03:22 crc kubenswrapper[4732]: I1010 08:03:22.241747 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerID="3769ebca370b8ff685d67b5895aff0904d1beffd9c38fe84cfee7cdc6c0f7020" exitCode=0 Oct 10 08:03:22 crc kubenswrapper[4732]: I1010 08:03:22.241789 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-b5d29" event={"ID":"4e84888c-c04f-40e8-b37c-bed6b2628cff","Type":"ContainerDied","Data":"3769ebca370b8ff685d67b5895aff0904d1beffd9c38fe84cfee7cdc6c0f7020"} Oct 10 08:03:22 crc kubenswrapper[4732]: I1010 08:03:22.241818 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5d29" event={"ID":"4e84888c-c04f-40e8-b37c-bed6b2628cff","Type":"ContainerStarted","Data":"f4493872a13a3388b01cb4991dc526446241604e60c7f5460bc2a9c8b3f26c7d"} Oct 10 08:03:23 crc kubenswrapper[4732]: I1010 08:03:23.254672 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m84lc" event={"ID":"256dc7f3-ecb5-4f7a-98e7-89ff022cda06","Type":"ContainerStarted","Data":"17cd614c6c12a8891ad51dadf74a21d73db088a9638c55e4b71f5a1dcdd26e8f"} Oct 10 08:03:24 crc kubenswrapper[4732]: I1010 08:03:24.267040 4732 generic.go:334] "Generic (PLEG): container finished" podID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerID="17cd614c6c12a8891ad51dadf74a21d73db088a9638c55e4b71f5a1dcdd26e8f" exitCode=0 Oct 10 08:03:24 crc kubenswrapper[4732]: I1010 08:03:24.267101 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m84lc" event={"ID":"256dc7f3-ecb5-4f7a-98e7-89ff022cda06","Type":"ContainerDied","Data":"17cd614c6c12a8891ad51dadf74a21d73db088a9638c55e4b71f5a1dcdd26e8f"} Oct 10 08:03:24 crc kubenswrapper[4732]: I1010 08:03:24.270582 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:03:24 crc kubenswrapper[4732]: I1010 08:03:24.274165 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerID="324ffba94e5e2016c542bc5d15ead398ce0b50f3e70b525c6a01f4b4b21c93a5" exitCode=0 Oct 10 08:03:24 crc kubenswrapper[4732]: I1010 08:03:24.274193 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-b5d29" event={"ID":"4e84888c-c04f-40e8-b37c-bed6b2628cff","Type":"ContainerDied","Data":"324ffba94e5e2016c542bc5d15ead398ce0b50f3e70b525c6a01f4b4b21c93a5"} Oct 10 08:03:25 crc kubenswrapper[4732]: I1010 08:03:25.286933 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5d29" event={"ID":"4e84888c-c04f-40e8-b37c-bed6b2628cff","Type":"ContainerStarted","Data":"bf6e57e4c2138b9c81610d8539c11289786995345a1a897bd1777ac0d5bd0f5c"} Oct 10 08:03:25 crc kubenswrapper[4732]: I1010 08:03:25.329850 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b5d29" podStartSLOduration=2.679245752 podStartE2EDuration="5.329821154s" podCreationTimestamp="2025-10-10 08:03:20 +0000 UTC" firstStartedPulling="2025-10-10 08:03:22.243547414 +0000 UTC m=+4329.313138665" lastFinishedPulling="2025-10-10 08:03:24.894122786 +0000 UTC m=+4331.963714067" observedRunningTime="2025-10-10 08:03:25.314938366 +0000 UTC m=+4332.384529697" watchObservedRunningTime="2025-10-10 08:03:25.329821154 +0000 UTC m=+4332.399412435" Oct 10 08:03:26 crc kubenswrapper[4732]: I1010 08:03:26.298191 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m84lc" event={"ID":"256dc7f3-ecb5-4f7a-98e7-89ff022cda06","Type":"ContainerStarted","Data":"fceb050fc989f5bad087d535187a3ae2a64faded3137e6f1b1c734c01555fd3d"} Oct 10 08:03:26 crc kubenswrapper[4732]: I1010 08:03:26.327676 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m84lc" podStartSLOduration=2.467572919 podStartE2EDuration="5.327643927s" podCreationTimestamp="2025-10-10 08:03:21 +0000 UTC" firstStartedPulling="2025-10-10 08:03:22.240195554 +0000 UTC m=+4329.309786805" lastFinishedPulling="2025-10-10 08:03:25.100266542 +0000 UTC m=+4332.169857813" observedRunningTime="2025-10-10 
08:03:26.321483873 +0000 UTC m=+4333.391075154" watchObservedRunningTime="2025-10-10 08:03:26.327643927 +0000 UTC m=+4333.397235208" Oct 10 08:03:29 crc kubenswrapper[4732]: I1010 08:03:29.660680 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:03:29 crc kubenswrapper[4732]: E1010 08:03:29.661237 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:03:30 crc kubenswrapper[4732]: I1010 08:03:30.954303 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:30 crc kubenswrapper[4732]: I1010 08:03:30.954758 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:31 crc kubenswrapper[4732]: I1010 08:03:31.021825 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:31 crc kubenswrapper[4732]: I1010 08:03:31.414393 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:31 crc kubenswrapper[4732]: I1010 08:03:31.470517 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5d29"] Oct 10 08:03:31 crc kubenswrapper[4732]: I1010 08:03:31.509531 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:31 crc kubenswrapper[4732]: I1010 08:03:31.509581 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:31 crc kubenswrapper[4732]: I1010 08:03:31.571754 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:32 crc kubenswrapper[4732]: I1010 08:03:32.429444 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:33 crc kubenswrapper[4732]: I1010 08:03:33.358018 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b5d29" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="registry-server" containerID="cri-o://bf6e57e4c2138b9c81610d8539c11289786995345a1a897bd1777ac0d5bd0f5c" gracePeriod=2 Oct 10 08:03:33 crc kubenswrapper[4732]: I1010 08:03:33.679787 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m84lc"] Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.372938 4732 generic.go:334] "Generic (PLEG): container finished" podID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerID="bf6e57e4c2138b9c81610d8539c11289786995345a1a897bd1777ac0d5bd0f5c" exitCode=0 Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.373092 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5d29" event={"ID":"4e84888c-c04f-40e8-b37c-bed6b2628cff","Type":"ContainerDied","Data":"bf6e57e4c2138b9c81610d8539c11289786995345a1a897bd1777ac0d5bd0f5c"} Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.373126 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b5d29" event={"ID":"4e84888c-c04f-40e8-b37c-bed6b2628cff","Type":"ContainerDied","Data":"f4493872a13a3388b01cb4991dc526446241604e60c7f5460bc2a9c8b3f26c7d"} Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.373136 4732 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4493872a13a3388b01cb4991dc526446241604e60c7f5460bc2a9c8b3f26c7d" Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.373308 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m84lc" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="registry-server" containerID="cri-o://fceb050fc989f5bad087d535187a3ae2a64faded3137e6f1b1c734c01555fd3d" gracePeriod=2 Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.390916 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.480809 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-catalog-content\") pod \"4e84888c-c04f-40e8-b37c-bed6b2628cff\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.481161 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt7tk\" (UniqueName: \"kubernetes.io/projected/4e84888c-c04f-40e8-b37c-bed6b2628cff-kube-api-access-kt7tk\") pod \"4e84888c-c04f-40e8-b37c-bed6b2628cff\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.481311 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-utilities\") pod \"4e84888c-c04f-40e8-b37c-bed6b2628cff\" (UID: \"4e84888c-c04f-40e8-b37c-bed6b2628cff\") " Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.487096 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4e84888c-c04f-40e8-b37c-bed6b2628cff-kube-api-access-kt7tk" (OuterVolumeSpecName: "kube-api-access-kt7tk") pod "4e84888c-c04f-40e8-b37c-bed6b2628cff" (UID: "4e84888c-c04f-40e8-b37c-bed6b2628cff"). InnerVolumeSpecName "kube-api-access-kt7tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.582852 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt7tk\" (UniqueName: \"kubernetes.io/projected/4e84888c-c04f-40e8-b37c-bed6b2628cff-kube-api-access-kt7tk\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.613214 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-utilities" (OuterVolumeSpecName: "utilities") pod "4e84888c-c04f-40e8-b37c-bed6b2628cff" (UID: "4e84888c-c04f-40e8-b37c-bed6b2628cff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:03:34 crc kubenswrapper[4732]: I1010 08:03:34.684642 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:35 crc kubenswrapper[4732]: I1010 08:03:35.383501 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b5d29" Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.394011 4732 generic.go:334] "Generic (PLEG): container finished" podID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerID="fceb050fc989f5bad087d535187a3ae2a64faded3137e6f1b1c734c01555fd3d" exitCode=0 Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.394147 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m84lc" event={"ID":"256dc7f3-ecb5-4f7a-98e7-89ff022cda06","Type":"ContainerDied","Data":"fceb050fc989f5bad087d535187a3ae2a64faded3137e6f1b1c734c01555fd3d"} Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.589677 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.616638 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b7g5\" (UniqueName: \"kubernetes.io/projected/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-kube-api-access-7b7g5\") pod \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.617002 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-catalog-content\") pod \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.617035 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-utilities\") pod \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\" (UID: \"256dc7f3-ecb5-4f7a-98e7-89ff022cda06\") " Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.618923 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-utilities" (OuterVolumeSpecName: "utilities") pod "256dc7f3-ecb5-4f7a-98e7-89ff022cda06" (UID: "256dc7f3-ecb5-4f7a-98e7-89ff022cda06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.624878 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-kube-api-access-7b7g5" (OuterVolumeSpecName: "kube-api-access-7b7g5") pod "256dc7f3-ecb5-4f7a-98e7-89ff022cda06" (UID: "256dc7f3-ecb5-4f7a-98e7-89ff022cda06"). InnerVolumeSpecName "kube-api-access-7b7g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.719834 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:36 crc kubenswrapper[4732]: I1010 08:03:36.719870 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7b7g5\" (UniqueName: \"kubernetes.io/projected/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-kube-api-access-7b7g5\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.408059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m84lc" event={"ID":"256dc7f3-ecb5-4f7a-98e7-89ff022cda06","Type":"ContainerDied","Data":"e930d24c0791dc6acaa3892c0df93b32fc2546991cb27a6018aceb1297b8f7c5"} Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.408135 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m84lc" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.408610 4732 scope.go:117] "RemoveContainer" containerID="fceb050fc989f5bad087d535187a3ae2a64faded3137e6f1b1c734c01555fd3d" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.436597 4732 scope.go:117] "RemoveContainer" containerID="17cd614c6c12a8891ad51dadf74a21d73db088a9638c55e4b71f5a1dcdd26e8f" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.457990 4732 scope.go:117] "RemoveContainer" containerID="1758464dda8513259c719998cd31892cf7663ee1477126809ba0f2ffb025ed62" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.826624 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "256dc7f3-ecb5-4f7a-98e7-89ff022cda06" (UID: "256dc7f3-ecb5-4f7a-98e7-89ff022cda06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.838874 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/256dc7f3-ecb5-4f7a-98e7-89ff022cda06-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.850585 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e84888c-c04f-40e8-b37c-bed6b2628cff" (UID: "4e84888c-c04f-40e8-b37c-bed6b2628cff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:03:37 crc kubenswrapper[4732]: I1010 08:03:37.939873 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e84888c-c04f-40e8-b37c-bed6b2628cff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:03:38 crc kubenswrapper[4732]: I1010 08:03:38.051235 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m84lc"] Oct 10 08:03:38 crc kubenswrapper[4732]: I1010 08:03:38.061578 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m84lc"] Oct 10 08:03:38 crc kubenswrapper[4732]: I1010 08:03:38.121268 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b5d29"] Oct 10 08:03:38 crc kubenswrapper[4732]: I1010 08:03:38.131642 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b5d29"] Oct 10 08:03:39 crc kubenswrapper[4732]: I1010 08:03:39.677542 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" path="/var/lib/kubelet/pods/256dc7f3-ecb5-4f7a-98e7-89ff022cda06/volumes" Oct 10 08:03:39 crc kubenswrapper[4732]: I1010 08:03:39.680086 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" path="/var/lib/kubelet/pods/4e84888c-c04f-40e8-b37c-bed6b2628cff/volumes" Oct 10 08:03:42 crc kubenswrapper[4732]: I1010 08:03:42.660018 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:03:42 crc kubenswrapper[4732]: E1010 08:03:42.660824 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:03:56 crc kubenswrapper[4732]: I1010 08:03:56.660724 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:03:56 crc kubenswrapper[4732]: E1010 08:03:56.661626 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:04:08 crc kubenswrapper[4732]: I1010 08:04:08.660284 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:04:08 crc kubenswrapper[4732]: E1010 08:04:08.661122 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:04:23 crc kubenswrapper[4732]: I1010 08:04:23.669449 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:04:23 crc kubenswrapper[4732]: E1010 08:04:23.670575 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:04:35 crc kubenswrapper[4732]: I1010 08:04:35.660079 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:04:35 crc kubenswrapper[4732]: E1010 08:04:35.661056 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:04:49 crc kubenswrapper[4732]: I1010 08:04:49.660422 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:04:49 crc kubenswrapper[4732]: E1010 08:04:49.661702 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:05:01 crc kubenswrapper[4732]: I1010 08:05:01.661301 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:05:01 crc kubenswrapper[4732]: E1010 08:05:01.662622 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:05:16 crc kubenswrapper[4732]: I1010 08:05:16.661068 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:05:16 crc kubenswrapper[4732]: E1010 08:05:16.662775 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:05:29 crc kubenswrapper[4732]: I1010 08:05:29.661482 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:05:29 crc kubenswrapper[4732]: E1010 08:05:29.663819 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:05:43 crc kubenswrapper[4732]: I1010 08:05:43.669136 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:05:43 crc kubenswrapper[4732]: E1010 08:05:43.670516 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:05:55 crc kubenswrapper[4732]: I1010 08:05:55.661083 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:05:55 crc kubenswrapper[4732]: E1010 08:05:55.662232 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:06:07 crc kubenswrapper[4732]: I1010 08:06:07.660418 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:06:07 crc kubenswrapper[4732]: E1010 08:06:07.661454 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.713868 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qqmtc"] Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.716474 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qqmtc"] Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 
08:06:16.858854 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-b5d8w"] Oct 10 08:06:16 crc kubenswrapper[4732]: E1010 08:06:16.859444 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="extract-content" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.859488 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="extract-content" Oct 10 08:06:16 crc kubenswrapper[4732]: E1010 08:06:16.859520 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="extract-utilities" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.859537 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="extract-utilities" Oct 10 08:06:16 crc kubenswrapper[4732]: E1010 08:06:16.859582 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="registry-server" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.859600 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="registry-server" Oct 10 08:06:16 crc kubenswrapper[4732]: E1010 08:06:16.859636 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="extract-content" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.859652 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="extract-content" Oct 10 08:06:16 crc kubenswrapper[4732]: E1010 08:06:16.859689 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="extract-utilities" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.859744 4732 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="extract-utilities" Oct 10 08:06:16 crc kubenswrapper[4732]: E1010 08:06:16.859773 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="registry-server" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.859789 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="registry-server" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.860168 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="256dc7f3-ecb5-4f7a-98e7-89ff022cda06" containerName="registry-server" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.860247 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e84888c-c04f-40e8-b37c-bed6b2628cff" containerName="registry-server" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.861225 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.864412 4732 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-hdknk" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.865119 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.865454 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.866525 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.878771 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b5d8w"] Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.997946 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-crc-storage\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.998041 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-kube-api-access-sl6fx\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:16 crc kubenswrapper[4732]: I1010 08:06:16.998165 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-node-mnt\") pod \"crc-storage-crc-b5d8w\" (UID: 
\"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.099821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-crc-storage\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.099926 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-kube-api-access-sl6fx\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.100016 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-node-mnt\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.100371 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-node-mnt\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.101751 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-crc-storage\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.128116 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-kube-api-access-sl6fx\") pod \"crc-storage-crc-b5d8w\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.198535 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.675153 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928bd8d4-82cc-4c7c-8d05-342e3e4b13db" path="/var/lib/kubelet/pods/928bd8d4-82cc-4c7c-8d05-342e3e4b13db/volumes" Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.737418 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-b5d8w"] Oct 10 08:06:17 crc kubenswrapper[4732]: I1010 08:06:17.941639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b5d8w" event={"ID":"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f","Type":"ContainerStarted","Data":"a4a4fa5bec6a9875bdb8122d94d65e8a2d0e718d3a2d42402d8002dcd86892f6"} Oct 10 08:06:18 crc kubenswrapper[4732]: I1010 08:06:18.952323 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b5d8w" event={"ID":"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f","Type":"ContainerStarted","Data":"781dd58a85ce736f53bc98a1bceb9500f57c71f032f9178b9dd187cf9d626c91"} Oct 10 08:06:18 crc kubenswrapper[4732]: I1010 08:06:18.969531 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-b5d8w" podStartSLOduration=2.155859992 podStartE2EDuration="2.969507206s" podCreationTimestamp="2025-10-10 08:06:16 +0000 UTC" firstStartedPulling="2025-10-10 08:06:17.749409075 +0000 UTC m=+4504.819000346" lastFinishedPulling="2025-10-10 08:06:18.563056309 +0000 UTC m=+4505.632647560" 
observedRunningTime="2025-10-10 08:06:18.968261593 +0000 UTC m=+4506.037852864" watchObservedRunningTime="2025-10-10 08:06:18.969507206 +0000 UTC m=+4506.039098487" Oct 10 08:06:19 crc kubenswrapper[4732]: I1010 08:06:19.966819 4732 generic.go:334] "Generic (PLEG): container finished" podID="1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" containerID="781dd58a85ce736f53bc98a1bceb9500f57c71f032f9178b9dd187cf9d626c91" exitCode=0 Oct 10 08:06:19 crc kubenswrapper[4732]: I1010 08:06:19.966916 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b5d8w" event={"ID":"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f","Type":"ContainerDied","Data":"781dd58a85ce736f53bc98a1bceb9500f57c71f032f9178b9dd187cf9d626c91"} Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.393290 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.570229 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-crc-storage\") pod \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.570401 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-kube-api-access-sl6fx\") pod \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.570459 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-node-mnt\") pod \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\" (UID: \"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f\") " Oct 10 08:06:21 crc 
kubenswrapper[4732]: I1010 08:06:21.571223 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" (UID: "1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.577740 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-kube-api-access-sl6fx" (OuterVolumeSpecName: "kube-api-access-sl6fx") pod "1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" (UID: "1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f"). InnerVolumeSpecName "kube-api-access-sl6fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.591233 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" (UID: "1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.660062 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:06:21 crc kubenswrapper[4732]: E1010 08:06:21.660414 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.672633 4732 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.672719 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl6fx\" (UniqueName: \"kubernetes.io/projected/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-kube-api-access-sl6fx\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.672739 4732 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.994003 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-b5d8w" event={"ID":"1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f","Type":"ContainerDied","Data":"a4a4fa5bec6a9875bdb8122d94d65e8a2d0e718d3a2d42402d8002dcd86892f6"} Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.994061 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a4a4fa5bec6a9875bdb8122d94d65e8a2d0e718d3a2d42402d8002dcd86892f6" Oct 10 08:06:21 crc kubenswrapper[4732]: I1010 08:06:21.994133 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-b5d8w" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.357181 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-b5d8w"] Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.368438 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-b5d8w"] Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.489240 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-6p6p4"] Oct 10 08:06:23 crc kubenswrapper[4732]: E1010 08:06:23.489791 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" containerName="storage" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.489820 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" containerName="storage" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.490104 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" containerName="storage" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.491056 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.496087 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.496109 4732 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-hdknk" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.496155 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.496308 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.499550 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6p6p4"] Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.605954 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9406de36-6825-40ee-9f0c-e4f69414213b-crc-storage\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.606033 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9406de36-6825-40ee-9f0c-e4f69414213b-node-mnt\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.606193 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f457b\" (UniqueName: \"kubernetes.io/projected/9406de36-6825-40ee-9f0c-e4f69414213b-kube-api-access-f457b\") pod \"crc-storage-crc-6p6p4\" (UID: 
\"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.676169 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f" path="/var/lib/kubelet/pods/1dd42f72-6c8c-4c66-b4e0-c9c6b159ce8f/volumes" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.707979 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f457b\" (UniqueName: \"kubernetes.io/projected/9406de36-6825-40ee-9f0c-e4f69414213b-kube-api-access-f457b\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.708121 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9406de36-6825-40ee-9f0c-e4f69414213b-crc-storage\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.708153 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9406de36-6825-40ee-9f0c-e4f69414213b-node-mnt\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.708507 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9406de36-6825-40ee-9f0c-e4f69414213b-node-mnt\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.709583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/9406de36-6825-40ee-9f0c-e4f69414213b-crc-storage\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.736255 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f457b\" (UniqueName: \"kubernetes.io/projected/9406de36-6825-40ee-9f0c-e4f69414213b-kube-api-access-f457b\") pod \"crc-storage-crc-6p6p4\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:23 crc kubenswrapper[4732]: I1010 08:06:23.846613 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:24 crc kubenswrapper[4732]: I1010 08:06:24.177359 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-6p6p4"] Oct 10 08:06:24 crc kubenswrapper[4732]: I1010 08:06:24.556808 4732 scope.go:117] "RemoveContainer" containerID="e76947e3580479b5fbc545e5760973d36044777457306df7cc25f18c02d47e43" Oct 10 08:06:25 crc kubenswrapper[4732]: I1010 08:06:25.038223 4732 generic.go:334] "Generic (PLEG): container finished" podID="9406de36-6825-40ee-9f0c-e4f69414213b" containerID="7db1ea5dc49feff72e1c2f64b96037d496cbea7358d4b6706cf0f5c206fc3346" exitCode=0 Oct 10 08:06:25 crc kubenswrapper[4732]: I1010 08:06:25.038472 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6p6p4" event={"ID":"9406de36-6825-40ee-9f0c-e4f69414213b","Type":"ContainerDied","Data":"7db1ea5dc49feff72e1c2f64b96037d496cbea7358d4b6706cf0f5c206fc3346"} Oct 10 08:06:25 crc kubenswrapper[4732]: I1010 08:06:25.038883 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6p6p4" event={"ID":"9406de36-6825-40ee-9f0c-e4f69414213b","Type":"ContainerStarted","Data":"df1716d71d5f206ac2a92dd468da56b5846814d1e1fbdb15bd6f0847567cc40a"} Oct 10 08:06:26 crc 
kubenswrapper[4732]: I1010 08:06:26.356866 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.555806 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9406de36-6825-40ee-9f0c-e4f69414213b-node-mnt\") pod \"9406de36-6825-40ee-9f0c-e4f69414213b\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.555903 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9406de36-6825-40ee-9f0c-e4f69414213b-crc-storage\") pod \"9406de36-6825-40ee-9f0c-e4f69414213b\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.555948 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f457b\" (UniqueName: \"kubernetes.io/projected/9406de36-6825-40ee-9f0c-e4f69414213b-kube-api-access-f457b\") pod \"9406de36-6825-40ee-9f0c-e4f69414213b\" (UID: \"9406de36-6825-40ee-9f0c-e4f69414213b\") " Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.555975 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9406de36-6825-40ee-9f0c-e4f69414213b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9406de36-6825-40ee-9f0c-e4f69414213b" (UID: "9406de36-6825-40ee-9f0c-e4f69414213b"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.556531 4732 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9406de36-6825-40ee-9f0c-e4f69414213b-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.564124 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9406de36-6825-40ee-9f0c-e4f69414213b-kube-api-access-f457b" (OuterVolumeSpecName: "kube-api-access-f457b") pod "9406de36-6825-40ee-9f0c-e4f69414213b" (UID: "9406de36-6825-40ee-9f0c-e4f69414213b"). InnerVolumeSpecName "kube-api-access-f457b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.599603 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9406de36-6825-40ee-9f0c-e4f69414213b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9406de36-6825-40ee-9f0c-e4f69414213b" (UID: "9406de36-6825-40ee-9f0c-e4f69414213b"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.657732 4732 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9406de36-6825-40ee-9f0c-e4f69414213b-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:26 crc kubenswrapper[4732]: I1010 08:06:26.657825 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f457b\" (UniqueName: \"kubernetes.io/projected/9406de36-6825-40ee-9f0c-e4f69414213b-kube-api-access-f457b\") on node \"crc\" DevicePath \"\"" Oct 10 08:06:27 crc kubenswrapper[4732]: I1010 08:06:27.060231 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-6p6p4" event={"ID":"9406de36-6825-40ee-9f0c-e4f69414213b","Type":"ContainerDied","Data":"df1716d71d5f206ac2a92dd468da56b5846814d1e1fbdb15bd6f0847567cc40a"} Oct 10 08:06:27 crc kubenswrapper[4732]: I1010 08:06:27.060295 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df1716d71d5f206ac2a92dd468da56b5846814d1e1fbdb15bd6f0847567cc40a" Oct 10 08:06:27 crc kubenswrapper[4732]: I1010 08:06:27.060316 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-6p6p4" Oct 10 08:06:34 crc kubenswrapper[4732]: I1010 08:06:34.660995 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:06:34 crc kubenswrapper[4732]: E1010 08:06:34.662041 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:06:49 crc kubenswrapper[4732]: I1010 08:06:49.660996 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:06:49 crc kubenswrapper[4732]: E1010 08:06:49.662043 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:07:03 crc kubenswrapper[4732]: I1010 08:07:03.668677 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:07:04 crc kubenswrapper[4732]: I1010 08:07:04.460041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"414b67626caa2a3a0941b29c620d3e844d3d0e2537b4a836117644eb6baed081"} Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.942736 4732 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54456b96dc-4l4xv"] Oct 10 08:08:30 crc kubenswrapper[4732]: E1010 08:08:30.943460 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9406de36-6825-40ee-9f0c-e4f69414213b" containerName="storage" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.943472 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9406de36-6825-40ee-9f0c-e4f69414213b" containerName="storage" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.943603 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9406de36-6825-40ee-9f0c-e4f69414213b" containerName="storage" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.944337 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.950413 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.950559 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.950736 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8dvdc" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.950835 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.961189 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9fbdfc9dc-vwzm4"] Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.962335 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.966897 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.971853 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54456b96dc-4l4xv"] Oct 10 08:08:30 crc kubenswrapper[4732]: I1010 08:08:30.982135 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fbdfc9dc-vwzm4"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.124971 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdjr\" (UniqueName: \"kubernetes.io/projected/65ac87e4-6966-4ccc-84e8-ac814d203792-kube-api-access-pfdjr\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.125286 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dxdx\" (UniqueName: \"kubernetes.io/projected/17434694-200c-4f66-9a6d-9b2a05e63bc7-kube-api-access-5dxdx\") pod \"dnsmasq-dns-54456b96dc-4l4xv\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.125322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-config\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.125344 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-dns-svc\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.125366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17434694-200c-4f66-9a6d-9b2a05e63bc7-config\") pod \"dnsmasq-dns-54456b96dc-4l4xv\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.227000 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdjr\" (UniqueName: \"kubernetes.io/projected/65ac87e4-6966-4ccc-84e8-ac814d203792-kube-api-access-pfdjr\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.227065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dxdx\" (UniqueName: \"kubernetes.io/projected/17434694-200c-4f66-9a6d-9b2a05e63bc7-kube-api-access-5dxdx\") pod \"dnsmasq-dns-54456b96dc-4l4xv\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.227104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-config\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.227128 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-dns-svc\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.227157 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17434694-200c-4f66-9a6d-9b2a05e63bc7-config\") pod \"dnsmasq-dns-54456b96dc-4l4xv\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.228243 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17434694-200c-4f66-9a6d-9b2a05e63bc7-config\") pod \"dnsmasq-dns-54456b96dc-4l4xv\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.228427 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-config\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.228639 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-dns-svc\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.256004 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdjr\" (UniqueName: \"kubernetes.io/projected/65ac87e4-6966-4ccc-84e8-ac814d203792-kube-api-access-pfdjr\") pod \"dnsmasq-dns-9fbdfc9dc-vwzm4\" (UID: 
\"65ac87e4-6966-4ccc-84e8-ac814d203792\") " pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.262028 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dxdx\" (UniqueName: \"kubernetes.io/projected/17434694-200c-4f66-9a6d-9b2a05e63bc7-kube-api-access-5dxdx\") pod \"dnsmasq-dns-54456b96dc-4l4xv\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.296748 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.307684 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.327725 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54456b96dc-4l4xv"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.347805 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6db99cf779-7nsr9"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.357603 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db99cf779-7nsr9"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.357748 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.540427 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-config\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.540501 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-dns-svc\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.540523 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-646r6\" (UniqueName: \"kubernetes.io/projected/85bdd98f-c6f6-4ef2-83b5-8793059cad85-kube-api-access-646r6\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.600865 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fbdfc9dc-vwzm4"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.623436 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d4fc659cc-8k8nj"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.624540 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.641583 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-config\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.641673 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-dns-svc\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.641709 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-646r6\" (UniqueName: \"kubernetes.io/projected/85bdd98f-c6f6-4ef2-83b5-8793059cad85-kube-api-access-646r6\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.642862 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-config\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.642879 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-dns-svc\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 
08:08:31.650100 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4fc659cc-8k8nj"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.664726 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-646r6\" (UniqueName: \"kubernetes.io/projected/85bdd98f-c6f6-4ef2-83b5-8793059cad85-kube-api-access-646r6\") pod \"dnsmasq-dns-6db99cf779-7nsr9\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.701596 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.751196 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77zx2\" (UniqueName: \"kubernetes.io/projected/01731ae6-0e0d-4b75-834a-9ded3d8718d3-kube-api-access-77zx2\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.751277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-dns-svc\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.751307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-config\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.843193 4732 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-54456b96dc-4l4xv"] Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.854242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77zx2\" (UniqueName: \"kubernetes.io/projected/01731ae6-0e0d-4b75-834a-9ded3d8718d3-kube-api-access-77zx2\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.854338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-dns-svc\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.854372 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-config\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.855522 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-config\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.856184 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-dns-svc\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 
08:08:31.857174 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.875825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77zx2\" (UniqueName: \"kubernetes.io/projected/01731ae6-0e0d-4b75-834a-9ded3d8718d3-kube-api-access-77zx2\") pod \"dnsmasq-dns-6d4fc659cc-8k8nj\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.953282 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.966566 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fbdfc9dc-vwzm4"] Oct 10 08:08:31 crc kubenswrapper[4732]: W1010 08:08:31.980364 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ac87e4_6966_4ccc_84e8_ac814d203792.slice/crio-632c5af6cbfde8e91636e8bd007bb0a8cfb74116305344e88831697168515270 WatchSource:0}: Error finding container 632c5af6cbfde8e91636e8bd007bb0a8cfb74116305344e88831697168515270: Status 404 returned error can't find the container with id 632c5af6cbfde8e91636e8bd007bb0a8cfb74116305344e88831697168515270 Oct 10 08:08:31 crc kubenswrapper[4732]: I1010 08:08:31.996799 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6db99cf779-7nsr9"] Oct 10 08:08:32 crc kubenswrapper[4732]: W1010 08:08:32.002347 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85bdd98f_c6f6_4ef2_83b5_8793059cad85.slice/crio-c800431b513742ec151a252f75464509a0cfd18ba0947deb5ec91fdc1b5d758b WatchSource:0}: Error finding container c800431b513742ec151a252f75464509a0cfd18ba0947deb5ec91fdc1b5d758b: Status 404 
returned error can't find the container with id c800431b513742ec151a252f75464509a0cfd18ba0947deb5ec91fdc1b5d758b Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.297766 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" event={"ID":"65ac87e4-6966-4ccc-84e8-ac814d203792","Type":"ContainerStarted","Data":"632c5af6cbfde8e91636e8bd007bb0a8cfb74116305344e88831697168515270"} Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.299048 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" event={"ID":"17434694-200c-4f66-9a6d-9b2a05e63bc7","Type":"ContainerStarted","Data":"43b9ea252d16205a5eb839a08719cd4a463daa335800d76a3aff20e5d73064d8"} Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.300671 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" event={"ID":"85bdd98f-c6f6-4ef2-83b5-8793059cad85","Type":"ContainerStarted","Data":"c800431b513742ec151a252f75464509a0cfd18ba0947deb5ec91fdc1b5d758b"} Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.387664 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d4fc659cc-8k8nj"] Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.469371 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.471183 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.473841 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.473887 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.474968 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.475017 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.475196 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.475457 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.475854 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-58gq6" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.482679 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.668536 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-config-data\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.668839 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/25efdbbb-72bc-423d-a78a-d623e1bd6627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.668872 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.668903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.668925 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25efdbbb-72bc-423d-a78a-d623e1bd6627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.668946 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.668970 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.669014 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqgsv\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-kube-api-access-hqgsv\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.669034 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.669075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.669218 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.748577 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.749718 
4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.751392 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.751725 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.751920 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.752058 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.752422 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.752604 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8vtq6" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.756428 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.766808 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.770136 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.770173 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.770207 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-config-data\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773288 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25efdbbb-72bc-423d-a78a-d623e1bd6627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773327 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773357 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773380 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25efdbbb-72bc-423d-a78a-d623e1bd6627-erlang-cookie-secret\") 
pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773394 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773415 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773457 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqgsv\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-kube-api-access-hqgsv\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.773473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.774269 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 
08:08:32.776031 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.776553 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-config-data\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.778414 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.779504 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.784843 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.785633 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.788470 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25efdbbb-72bc-423d-a78a-d623e1bd6627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.790412 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25efdbbb-72bc-423d-a78a-d623e1bd6627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.795681 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.795726 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/95d940685356a8fa009428f47b914765f2e8f1c1e9fd807253f68cbe91376583/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.800228 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqgsv\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-kube-api-access-hqgsv\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.832881 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " pod="openstack/rabbitmq-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874630 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09d5a68f-9c78-46c7-9291-7e3dfad23f93-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874668 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874702 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpltl\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-kube-api-access-bpltl\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874738 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874758 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09d5a68f-9c78-46c7-9291-7e3dfad23f93-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874781 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874815 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874830 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874938 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.874964 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.875002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976278 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09d5a68f-9c78-46c7-9291-7e3dfad23f93-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976327 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976383 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976400 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976428 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976444 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976480 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09d5a68f-9c78-46c7-9291-7e3dfad23f93-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976524 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976540 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpltl\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-kube-api-access-bpltl\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.976577 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.977722 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.978301 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.978483 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.978509 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e59a483b765e13d49cfb3268cec746d1de826e8104c888b4671a6933e93acf8/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.978524 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.978751 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.979179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.981179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.981231 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.982618 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09d5a68f-9c78-46c7-9291-7e3dfad23f93-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:32 crc kubenswrapper[4732]: I1010 08:08:32.984445 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09d5a68f-9c78-46c7-9291-7e3dfad23f93-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:32.994350 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpltl\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-kube-api-access-bpltl\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.013807 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.070240 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.092373 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.310606 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" event={"ID":"01731ae6-0e0d-4b75-834a-9ded3d8718d3","Type":"ContainerStarted","Data":"9ee0599ae58670937a2f212206a5a62c0373d1856740c30d1355e98fc10022d6"} Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.535253 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:08:33 crc kubenswrapper[4732]: W1010 08:08:33.546120 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d5a68f_9c78_46c7_9291_7e3dfad23f93.slice/crio-f7af1d44b5d351b4a31b419a93034a1889968f2714e265a1686e3498bd7cbc01 WatchSource:0}: Error finding container f7af1d44b5d351b4a31b419a93034a1889968f2714e265a1686e3498bd7cbc01: Status 404 returned error can't find the container with id f7af1d44b5d351b4a31b419a93034a1889968f2714e265a1686e3498bd7cbc01 Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.608549 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.911604 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.917961 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.920350 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-6jxbq" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.920518 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.920655 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.920825 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.922414 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.934107 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 08:08:33 crc kubenswrapper[4732]: I1010 08:08:33.941810 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114209 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-kolla-config\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114553 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " 
pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114590 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114621 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-config-data-default\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114643 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cn48\" (UniqueName: \"kubernetes.io/projected/bf925f45-1e6a-41dc-bc76-c554a0a21636-kube-api-access-4cn48\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-secrets\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114722 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 
08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114758 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.114794 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf925f45-1e6a-41dc-bc76-c554a0a21636-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216419 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-config-data-default\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216478 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cn48\" (UniqueName: \"kubernetes.io/projected/bf925f45-1e6a-41dc-bc76-c554a0a21636-kube-api-access-4cn48\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216502 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-secrets\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216532 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216565 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf925f45-1e6a-41dc-bc76-c554a0a21636-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216602 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-kolla-config\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216629 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.216659 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.218118 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-kolla-config\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.218405 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bf925f45-1e6a-41dc-bc76-c554a0a21636-config-data-generated\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.218452 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-config-data-default\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.219774 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf925f45-1e6a-41dc-bc76-c554a0a21636-operator-scripts\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.219897 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.219920 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4ed3b1fc6c3e16a91ce028af190e8aa6eee2dfed0134194f877a1c42248b8862/globalmount\"" pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.222768 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-secrets\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.223048 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.224415 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf925f45-1e6a-41dc-bc76-c554a0a21636-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.236532 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cn48\" (UniqueName: \"kubernetes.io/projected/bf925f45-1e6a-41dc-bc76-c554a0a21636-kube-api-access-4cn48\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " 
pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.243997 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-99ba7f27-4403-4f90-9a5f-576cb1f408e9\") pod \"openstack-galera-0\" (UID: \"bf925f45-1e6a-41dc-bc76-c554a0a21636\") " pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.255705 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.322839 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"25efdbbb-72bc-423d-a78a-d623e1bd6627","Type":"ContainerStarted","Data":"4405eeb21eb6e193ae59d7fddd870fa91bdc2d78d402be2a3b14d76408ced92c"} Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.324561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09d5a68f-9c78-46c7-9291-7e3dfad23f93","Type":"ContainerStarted","Data":"f7af1d44b5d351b4a31b419a93034a1889968f2714e265a1686e3498bd7cbc01"} Oct 10 08:08:34 crc kubenswrapper[4732]: I1010 08:08:34.786192 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 10 08:08:34 crc kubenswrapper[4732]: W1010 08:08:34.791501 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf925f45_1e6a_41dc_bc76_c554a0a21636.slice/crio-b98c2021aee701b04ee7a961b3ef7a1abc668897c3055771e9ae689f795db85c WatchSource:0}: Error finding container b98c2021aee701b04ee7a961b3ef7a1abc668897c3055771e9ae689f795db85c: Status 404 returned error can't find the container with id b98c2021aee701b04ee7a961b3ef7a1abc668897c3055771e9ae689f795db85c Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.133527 4732 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.134975 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.137382 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-fchkv" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.137538 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.137591 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.140456 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.147799 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.232570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.232618 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j875k\" (UniqueName: \"kubernetes.io/projected/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-kube-api-access-j875k\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 
08:08:35.232656 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.232675 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.232726 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.232750 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.232772 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc 
kubenswrapper[4732]: I1010 08:08:35.232790 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.232825 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d20d4680-48dc-4f84-a027-05264e5aa378\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d20d4680-48dc-4f84-a027-05264e5aa378\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.335838 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d20d4680-48dc-4f84-a027-05264e5aa378\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d20d4680-48dc-4f84-a027-05264e5aa378\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.335916 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.335948 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j875k\" (UniqueName: \"kubernetes.io/projected/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-kube-api-access-j875k\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " 
pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.335994 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.336014 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.336060 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.336092 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.336134 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc 
kubenswrapper[4732]: I1010 08:08:35.336156 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.337892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.339334 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.339869 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.341270 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.341306 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d20d4680-48dc-4f84-a027-05264e5aa378\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d20d4680-48dc-4f84-a027-05264e5aa378\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4e902b00d30707ece9950fd8b968114385a128f6c592a2327a089ece2a42cc87/globalmount\"" pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.342619 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bf925f45-1e6a-41dc-bc76-c554a0a21636","Type":"ContainerStarted","Data":"b98c2021aee701b04ee7a961b3ef7a1abc668897c3055771e9ae689f795db85c"} Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.346078 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.607286 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.610861 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.610987 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.614753 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j875k\" (UniqueName: \"kubernetes.io/projected/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-kube-api-access-j875k\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.615113 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/05bbbd5f-b1ea-4a6d-9787-082d27fbfae6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.616313 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.619776 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-sljbb" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.622193 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.622381 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.630923 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.726613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d20d4680-48dc-4f84-a027-05264e5aa378\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d20d4680-48dc-4f84-a027-05264e5aa378\") pod \"openstack-cell1-galera-0\" (UID: \"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6\") " pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.742283 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f47v6\" (UniqueName: \"kubernetes.io/projected/5044a9c9-6ab0-449a-817a-904207e1dba9-kube-api-access-f47v6\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.742318 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5044a9c9-6ab0-449a-817a-904207e1dba9-kolla-config\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.742344 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5044a9c9-6ab0-449a-817a-904207e1dba9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.742397 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5044a9c9-6ab0-449a-817a-904207e1dba9-config-data\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.742429 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5044a9c9-6ab0-449a-817a-904207e1dba9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.758346 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.843770 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5044a9c9-6ab0-449a-817a-904207e1dba9-kolla-config\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.843832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5044a9c9-6ab0-449a-817a-904207e1dba9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.843907 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5044a9c9-6ab0-449a-817a-904207e1dba9-config-data\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.843942 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5044a9c9-6ab0-449a-817a-904207e1dba9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.843978 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f47v6\" (UniqueName: \"kubernetes.io/projected/5044a9c9-6ab0-449a-817a-904207e1dba9-kube-api-access-f47v6\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.847617 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/5044a9c9-6ab0-449a-817a-904207e1dba9-config-data\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.847948 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5044a9c9-6ab0-449a-817a-904207e1dba9-kolla-config\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.851672 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5044a9c9-6ab0-449a-817a-904207e1dba9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.851749 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5044a9c9-6ab0-449a-817a-904207e1dba9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:35 crc kubenswrapper[4732]: I1010 08:08:35.861028 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f47v6\" (UniqueName: \"kubernetes.io/projected/5044a9c9-6ab0-449a-817a-904207e1dba9-kube-api-access-f47v6\") pod \"memcached-0\" (UID: \"5044a9c9-6ab0-449a-817a-904207e1dba9\") " pod="openstack/memcached-0" Oct 10 08:08:36 crc kubenswrapper[4732]: I1010 08:08:36.029486 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 10 08:08:36 crc kubenswrapper[4732]: I1010 08:08:36.242504 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 10 08:08:36 crc kubenswrapper[4732]: W1010 08:08:36.250794 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05bbbd5f_b1ea_4a6d_9787_082d27fbfae6.slice/crio-a7b6904c64153a69a6f82c5d80d8313f65648a5fe52302e1cfe29b30a46b808c WatchSource:0}: Error finding container a7b6904c64153a69a6f82c5d80d8313f65648a5fe52302e1cfe29b30a46b808c: Status 404 returned error can't find the container with id a7b6904c64153a69a6f82c5d80d8313f65648a5fe52302e1cfe29b30a46b808c Oct 10 08:08:36 crc kubenswrapper[4732]: I1010 08:08:36.352495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6","Type":"ContainerStarted","Data":"a7b6904c64153a69a6f82c5d80d8313f65648a5fe52302e1cfe29b30a46b808c"} Oct 10 08:08:36 crc kubenswrapper[4732]: I1010 08:08:36.486781 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 10 08:08:36 crc kubenswrapper[4732]: W1010 08:08:36.502636 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5044a9c9_6ab0_449a_817a_904207e1dba9.slice/crio-c941db190a62c08d68db7af4aff1f5db5cb524a61d95e69d7505428ffa6a1190 WatchSource:0}: Error finding container c941db190a62c08d68db7af4aff1f5db5cb524a61d95e69d7505428ffa6a1190: Status 404 returned error can't find the container with id c941db190a62c08d68db7af4aff1f5db5cb524a61d95e69d7505428ffa6a1190 Oct 10 08:08:37 crc kubenswrapper[4732]: I1010 08:08:37.363746 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"5044a9c9-6ab0-449a-817a-904207e1dba9","Type":"ContainerStarted","Data":"c941db190a62c08d68db7af4aff1f5db5cb524a61d95e69d7505428ffa6a1190"} Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.372052 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5x6t7"] Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.381392 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x6t7"] Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.381543 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.441835 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-utilities\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.441896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-catalog-content\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.441917 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z85vs\" (UniqueName: \"kubernetes.io/projected/5a38037b-b925-4eaa-833d-effa7af118ba-kube-api-access-z85vs\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.543703 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-utilities\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.543776 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-catalog-content\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.543799 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z85vs\" (UniqueName: \"kubernetes.io/projected/5a38037b-b925-4eaa-833d-effa7af118ba-kube-api-access-z85vs\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.544577 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-utilities\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.544607 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-catalog-content\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.566665 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z85vs\" (UniqueName: \"kubernetes.io/projected/5a38037b-b925-4eaa-833d-effa7af118ba-kube-api-access-z85vs\") pod \"redhat-marketplace-5x6t7\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:42 crc kubenswrapper[4732]: I1010 08:08:42.715476 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:08:43 crc kubenswrapper[4732]: I1010 08:08:43.157590 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x6t7"] Oct 10 08:08:43 crc kubenswrapper[4732]: I1010 08:08:43.428807 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x6t7" event={"ID":"5a38037b-b925-4eaa-833d-effa7af118ba","Type":"ContainerStarted","Data":"8a1240ea577bfc30478d711c45b70e56a450cb72226361dfbd8b8d992a9ac9fa"} Oct 10 08:08:45 crc kubenswrapper[4732]: I1010 08:08:45.445528 4732 generic.go:334] "Generic (PLEG): container finished" podID="5a38037b-b925-4eaa-833d-effa7af118ba" containerID="cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18" exitCode=0 Oct 10 08:08:45 crc kubenswrapper[4732]: I1010 08:08:45.445666 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x6t7" event={"ID":"5a38037b-b925-4eaa-833d-effa7af118ba","Type":"ContainerDied","Data":"cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.585289 4732 generic.go:334] "Generic (PLEG): container finished" podID="65ac87e4-6966-4ccc-84e8-ac814d203792" containerID="ade17544b0a3cf5e906a7b195547dc46960d8a2dfda3e93d76dfd2877421cf68" exitCode=0 Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.585359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" 
event={"ID":"65ac87e4-6966-4ccc-84e8-ac814d203792","Type":"ContainerDied","Data":"ade17544b0a3cf5e906a7b195547dc46960d8a2dfda3e93d76dfd2877421cf68"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.590207 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5044a9c9-6ab0-449a-817a-904207e1dba9","Type":"ContainerStarted","Data":"9ba62bd5ef8faa77e89165ea96ebd0ea88a4559e0fb1e5876cce7d497e481729"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.590354 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.596570 4732 generic.go:334] "Generic (PLEG): container finished" podID="17434694-200c-4f66-9a6d-9b2a05e63bc7" containerID="b30a47b9b5b6205ce7361909addafaf1d320fa64d55a2561fc30beea0389c931" exitCode=0 Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.596981 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" event={"ID":"17434694-200c-4f66-9a6d-9b2a05e63bc7","Type":"ContainerDied","Data":"b30a47b9b5b6205ce7361909addafaf1d320fa64d55a2561fc30beea0389c931"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.600668 4732 generic.go:334] "Generic (PLEG): container finished" podID="5a38037b-b925-4eaa-833d-effa7af118ba" containerID="5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf" exitCode=0 Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.600830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x6t7" event={"ID":"5a38037b-b925-4eaa-833d-effa7af118ba","Type":"ContainerDied","Data":"5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.603480 4732 generic.go:334] "Generic (PLEG): container finished" podID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerID="86342c047da9cec03e966149bad5d6db6261921d197eff2e8e1609a6744b1a8c" 
exitCode=0 Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.603528 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" event={"ID":"85bdd98f-c6f6-4ef2-83b5-8793059cad85","Type":"ContainerDied","Data":"86342c047da9cec03e966149bad5d6db6261921d197eff2e8e1609a6744b1a8c"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.605376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bf925f45-1e6a-41dc-bc76-c554a0a21636","Type":"ContainerStarted","Data":"74241fc2d67337b59131cf0bd0f1e14334a93cb031b36aef1baf8c106de248ab"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.610815 4732 generic.go:334] "Generic (PLEG): container finished" podID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerID="196c894bc57ba3b08383681e4e1c6c75c2cb75ee81d00389895aecf75d71f317" exitCode=0 Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.610916 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" event={"ID":"01731ae6-0e0d-4b75-834a-9ded3d8718d3","Type":"ContainerDied","Data":"196c894bc57ba3b08383681e4e1c6c75c2cb75ee81d00389895aecf75d71f317"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.617104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6","Type":"ContainerStarted","Data":"43888e7cdb20253dc90c32d07aa690ad9e6f41717765b058aa4833719b0c30b2"} Oct 10 08:09:00 crc kubenswrapper[4732]: I1010 08:09:00.670356 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.95348998 podStartE2EDuration="25.670315525s" podCreationTimestamp="2025-10-10 08:08:35 +0000 UTC" firstStartedPulling="2025-10-10 08:08:36.506005141 +0000 UTC m=+4643.575596382" lastFinishedPulling="2025-10-10 08:08:59.222830666 +0000 UTC m=+4666.292421927" observedRunningTime="2025-10-10 
08:09:00.660753797 +0000 UTC m=+4667.730345048" watchObservedRunningTime="2025-10-10 08:09:00.670315525 +0000 UTC m=+4667.739906766" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.024732 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.073735 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.145892 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfdjr\" (UniqueName: \"kubernetes.io/projected/65ac87e4-6966-4ccc-84e8-ac814d203792-kube-api-access-pfdjr\") pod \"65ac87e4-6966-4ccc-84e8-ac814d203792\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.145983 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17434694-200c-4f66-9a6d-9b2a05e63bc7-config\") pod \"17434694-200c-4f66-9a6d-9b2a05e63bc7\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.146018 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-dns-svc\") pod \"65ac87e4-6966-4ccc-84e8-ac814d203792\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.146064 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dxdx\" (UniqueName: \"kubernetes.io/projected/17434694-200c-4f66-9a6d-9b2a05e63bc7-kube-api-access-5dxdx\") pod \"17434694-200c-4f66-9a6d-9b2a05e63bc7\" (UID: \"17434694-200c-4f66-9a6d-9b2a05e63bc7\") " Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.146101 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-config\") pod \"65ac87e4-6966-4ccc-84e8-ac814d203792\" (UID: \"65ac87e4-6966-4ccc-84e8-ac814d203792\") " Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.150683 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ac87e4-6966-4ccc-84e8-ac814d203792-kube-api-access-pfdjr" (OuterVolumeSpecName: "kube-api-access-pfdjr") pod "65ac87e4-6966-4ccc-84e8-ac814d203792" (UID: "65ac87e4-6966-4ccc-84e8-ac814d203792"). InnerVolumeSpecName "kube-api-access-pfdjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.150997 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17434694-200c-4f66-9a6d-9b2a05e63bc7-kube-api-access-5dxdx" (OuterVolumeSpecName: "kube-api-access-5dxdx") pod "17434694-200c-4f66-9a6d-9b2a05e63bc7" (UID: "17434694-200c-4f66-9a6d-9b2a05e63bc7"). InnerVolumeSpecName "kube-api-access-5dxdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.163152 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-config" (OuterVolumeSpecName: "config") pod "65ac87e4-6966-4ccc-84e8-ac814d203792" (UID: "65ac87e4-6966-4ccc-84e8-ac814d203792"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.163890 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17434694-200c-4f66-9a6d-9b2a05e63bc7-config" (OuterVolumeSpecName: "config") pod "17434694-200c-4f66-9a6d-9b2a05e63bc7" (UID: "17434694-200c-4f66-9a6d-9b2a05e63bc7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.165269 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65ac87e4-6966-4ccc-84e8-ac814d203792" (UID: "65ac87e4-6966-4ccc-84e8-ac814d203792"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.247556 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfdjr\" (UniqueName: \"kubernetes.io/projected/65ac87e4-6966-4ccc-84e8-ac814d203792-kube-api-access-pfdjr\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.247604 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17434694-200c-4f66-9a6d-9b2a05e63bc7-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.247625 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.247643 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dxdx\" (UniqueName: \"kubernetes.io/projected/17434694-200c-4f66-9a6d-9b2a05e63bc7-kube-api-access-5dxdx\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.247661 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65ac87e4-6966-4ccc-84e8-ac814d203792-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.627009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"25efdbbb-72bc-423d-a78a-d623e1bd6627","Type":"ContainerStarted","Data":"18e4ac8756d3f4bde508164449ac6730f79866e994d4a05d74cd56dcbc62842e"} Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.629659 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" event={"ID":"01731ae6-0e0d-4b75-834a-9ded3d8718d3","Type":"ContainerStarted","Data":"e767a6d58a9c5fd1799020d0852fba2c18b09ae0a642ebea7a82f66c60b9853a"} Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.629839 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.631504 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" event={"ID":"65ac87e4-6966-4ccc-84e8-ac814d203792","Type":"ContainerDied","Data":"632c5af6cbfde8e91636e8bd007bb0a8cfb74116305344e88831697168515270"} Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.631529 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fbdfc9dc-vwzm4" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.631537 4732 scope.go:117] "RemoveContainer" containerID="ade17544b0a3cf5e906a7b195547dc46960d8a2dfda3e93d76dfd2877421cf68" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.634208 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09d5a68f-9c78-46c7-9291-7e3dfad23f93","Type":"ContainerStarted","Data":"23343e4052c7f2073f3a383e220af8a0d2a8fe0cc53723404974313264c4f4a8"} Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.636469 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" event={"ID":"17434694-200c-4f66-9a6d-9b2a05e63bc7","Type":"ContainerDied","Data":"43b9ea252d16205a5eb839a08719cd4a463daa335800d76a3aff20e5d73064d8"} Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.636564 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54456b96dc-4l4xv" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.651818 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x6t7" event={"ID":"5a38037b-b925-4eaa-833d-effa7af118ba","Type":"ContainerStarted","Data":"80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15"} Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.661119 4732 scope.go:117] "RemoveContainer" containerID="b30a47b9b5b6205ce7361909addafaf1d320fa64d55a2561fc30beea0389c931" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.670904 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" event={"ID":"85bdd98f-c6f6-4ef2-83b5-8793059cad85","Type":"ContainerStarted","Data":"86164c13a71f83817c0af4e408557990e8c6a51f654d82397b955dc91aaa5e8d"} Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.670984 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.711036 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" podStartSLOduration=3.375218321 podStartE2EDuration="30.711019599s" podCreationTimestamp="2025-10-10 08:08:31 +0000 UTC" firstStartedPulling="2025-10-10 08:08:32.014129577 +0000 UTC m=+4639.083720818" lastFinishedPulling="2025-10-10 08:08:59.349930845 +0000 UTC m=+4666.419522096" observedRunningTime="2025-10-10 08:09:01.710170696 +0000 UTC m=+4668.779761947" watchObservedRunningTime="2025-10-10 08:09:01.711019599 +0000 UTC m=+4668.780610840" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.729978 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5x6t7" podStartSLOduration=6.520671366 podStartE2EDuration="19.72996297s" podCreationTimestamp="2025-10-10 08:08:42 +0000 UTC" firstStartedPulling="2025-10-10 08:08:47.920679271 +0000 UTC m=+4654.990270512" lastFinishedPulling="2025-10-10 08:09:01.129970875 +0000 UTC m=+4668.199562116" observedRunningTime="2025-10-10 08:09:01.728398728 +0000 UTC m=+4668.797989989" watchObservedRunningTime="2025-10-10 08:09:01.72996297 +0000 UTC m=+4668.799554211" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.773212 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" podStartSLOduration=3.882974977 podStartE2EDuration="30.773195386s" podCreationTimestamp="2025-10-10 08:08:31 +0000 UTC" firstStartedPulling="2025-10-10 08:08:32.397973191 +0000 UTC m=+4639.467564442" lastFinishedPulling="2025-10-10 08:08:59.2881936 +0000 UTC m=+4666.357784851" observedRunningTime="2025-10-10 08:09:01.74885816 +0000 UTC m=+4668.818449411" watchObservedRunningTime="2025-10-10 08:09:01.773195386 +0000 UTC m=+4668.842786627" Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 
08:09:01.790441 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fbdfc9dc-vwzm4"] Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.792587 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9fbdfc9dc-vwzm4"] Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.819253 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54456b96dc-4l4xv"] Oct 10 08:09:01 crc kubenswrapper[4732]: I1010 08:09:01.827271 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54456b96dc-4l4xv"] Oct 10 08:09:02 crc kubenswrapper[4732]: I1010 08:09:02.716687 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:09:02 crc kubenswrapper[4732]: I1010 08:09:02.716785 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:09:02 crc kubenswrapper[4732]: I1010 08:09:02.766373 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:09:03 crc kubenswrapper[4732]: I1010 08:09:03.675180 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17434694-200c-4f66-9a6d-9b2a05e63bc7" path="/var/lib/kubelet/pods/17434694-200c-4f66-9a6d-9b2a05e63bc7/volumes" Oct 10 08:09:03 crc kubenswrapper[4732]: I1010 08:09:03.676747 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ac87e4-6966-4ccc-84e8-ac814d203792" path="/var/lib/kubelet/pods/65ac87e4-6966-4ccc-84e8-ac814d203792/volumes" Oct 10 08:09:04 crc kubenswrapper[4732]: I1010 08:09:04.706764 4732 generic.go:334] "Generic (PLEG): container finished" podID="05bbbd5f-b1ea-4a6d-9787-082d27fbfae6" containerID="43888e7cdb20253dc90c32d07aa690ad9e6f41717765b058aa4833719b0c30b2" exitCode=0 Oct 10 08:09:04 crc kubenswrapper[4732]: I1010 08:09:04.706901 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6","Type":"ContainerDied","Data":"43888e7cdb20253dc90c32d07aa690ad9e6f41717765b058aa4833719b0c30b2"} Oct 10 08:09:04 crc kubenswrapper[4732]: I1010 08:09:04.709568 4732 generic.go:334] "Generic (PLEG): container finished" podID="bf925f45-1e6a-41dc-bc76-c554a0a21636" containerID="74241fc2d67337b59131cf0bd0f1e14334a93cb031b36aef1baf8c106de248ab" exitCode=0 Oct 10 08:09:04 crc kubenswrapper[4732]: I1010 08:09:04.709906 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bf925f45-1e6a-41dc-bc76-c554a0a21636","Type":"ContainerDied","Data":"74241fc2d67337b59131cf0bd0f1e14334a93cb031b36aef1baf8c106de248ab"} Oct 10 08:09:05 crc kubenswrapper[4732]: I1010 08:09:05.720913 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"05bbbd5f-b1ea-4a6d-9787-082d27fbfae6","Type":"ContainerStarted","Data":"49d1e2a75e837aaea5dc9c9b600640bab4b336c60b5f535ff30b436b24444e84"} Oct 10 08:09:05 crc kubenswrapper[4732]: I1010 08:09:05.728034 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"bf925f45-1e6a-41dc-bc76-c554a0a21636","Type":"ContainerStarted","Data":"12097a9043db0355c3ca41c5e339c43ff76ad6555abc9762686d89f381b03636"} Oct 10 08:09:05 crc kubenswrapper[4732]: I1010 08:09:05.758642 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 10 08:09:05 crc kubenswrapper[4732]: I1010 08:09:05.758730 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 10 08:09:05 crc kubenswrapper[4732]: I1010 08:09:05.767178 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.69680938 
podStartE2EDuration="31.767145901s" podCreationTimestamp="2025-10-10 08:08:34 +0000 UTC" firstStartedPulling="2025-10-10 08:08:36.257126528 +0000 UTC m=+4643.326717769" lastFinishedPulling="2025-10-10 08:08:59.327463049 +0000 UTC m=+4666.397054290" observedRunningTime="2025-10-10 08:09:05.759507425 +0000 UTC m=+4672.829098686" watchObservedRunningTime="2025-10-10 08:09:05.767145901 +0000 UTC m=+4672.836737142" Oct 10 08:09:05 crc kubenswrapper[4732]: I1010 08:09:05.801558 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.305025193 podStartE2EDuration="33.801534228s" podCreationTimestamp="2025-10-10 08:08:32 +0000 UTC" firstStartedPulling="2025-10-10 08:08:34.795166039 +0000 UTC m=+4641.864757280" lastFinishedPulling="2025-10-10 08:08:59.291675074 +0000 UTC m=+4666.361266315" observedRunningTime="2025-10-10 08:09:05.792222647 +0000 UTC m=+4672.861813888" watchObservedRunningTime="2025-10-10 08:09:05.801534228 +0000 UTC m=+4672.871125469" Oct 10 08:09:06 crc kubenswrapper[4732]: I1010 08:09:06.031103 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 10 08:09:06 crc kubenswrapper[4732]: I1010 08:09:06.703899 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:09:06 crc kubenswrapper[4732]: I1010 08:09:06.955868 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:09:07 crc kubenswrapper[4732]: I1010 08:09:07.017782 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db99cf779-7nsr9"] Oct 10 08:09:07 crc kubenswrapper[4732]: I1010 08:09:07.018007 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" podUID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerName="dnsmasq-dns" 
containerID="cri-o://86164c13a71f83817c0af4e408557990e8c6a51f654d82397b955dc91aaa5e8d" gracePeriod=10 Oct 10 08:09:07 crc kubenswrapper[4732]: I1010 08:09:07.753837 4732 generic.go:334] "Generic (PLEG): container finished" podID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerID="86164c13a71f83817c0af4e408557990e8c6a51f654d82397b955dc91aaa5e8d" exitCode=0 Oct 10 08:09:07 crc kubenswrapper[4732]: I1010 08:09:07.753907 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" event={"ID":"85bdd98f-c6f6-4ef2-83b5-8793059cad85","Type":"ContainerDied","Data":"86164c13a71f83817c0af4e408557990e8c6a51f654d82397b955dc91aaa5e8d"} Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.224221 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.261243 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-dns-svc\") pod \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.261438 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-646r6\" (UniqueName: \"kubernetes.io/projected/85bdd98f-c6f6-4ef2-83b5-8793059cad85-kube-api-access-646r6\") pod \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.261534 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-config\") pod \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\" (UID: \"85bdd98f-c6f6-4ef2-83b5-8793059cad85\") " Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.273534 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85bdd98f-c6f6-4ef2-83b5-8793059cad85-kube-api-access-646r6" (OuterVolumeSpecName: "kube-api-access-646r6") pod "85bdd98f-c6f6-4ef2-83b5-8793059cad85" (UID: "85bdd98f-c6f6-4ef2-83b5-8793059cad85"). InnerVolumeSpecName "kube-api-access-646r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.319147 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "85bdd98f-c6f6-4ef2-83b5-8793059cad85" (UID: "85bdd98f-c6f6-4ef2-83b5-8793059cad85"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.335123 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-config" (OuterVolumeSpecName: "config") pod "85bdd98f-c6f6-4ef2-83b5-8793059cad85" (UID: "85bdd98f-c6f6-4ef2-83b5-8793059cad85"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.363628 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.363949 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-646r6\" (UniqueName: \"kubernetes.io/projected/85bdd98f-c6f6-4ef2-83b5-8793059cad85-kube-api-access-646r6\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.364070 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/85bdd98f-c6f6-4ef2-83b5-8793059cad85-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.765589 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" event={"ID":"85bdd98f-c6f6-4ef2-83b5-8793059cad85","Type":"ContainerDied","Data":"c800431b513742ec151a252f75464509a0cfd18ba0947deb5ec91fdc1b5d758b"} Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.765669 4732 scope.go:117] "RemoveContainer" containerID="86164c13a71f83817c0af4e408557990e8c6a51f654d82397b955dc91aaa5e8d" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.765803 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6db99cf779-7nsr9" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.794720 4732 scope.go:117] "RemoveContainer" containerID="86342c047da9cec03e966149bad5d6db6261921d197eff2e8e1609a6744b1a8c" Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.816479 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6db99cf779-7nsr9"] Oct 10 08:09:08 crc kubenswrapper[4732]: I1010 08:09:08.822791 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6db99cf779-7nsr9"] Oct 10 08:09:09 crc kubenswrapper[4732]: I1010 08:09:09.680265 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" path="/var/lib/kubelet/pods/85bdd98f-c6f6-4ef2-83b5-8793059cad85/volumes" Oct 10 08:09:12 crc kubenswrapper[4732]: I1010 08:09:12.793028 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:09:12 crc kubenswrapper[4732]: I1010 08:09:12.871525 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x6t7"] Oct 10 08:09:12 crc kubenswrapper[4732]: I1010 08:09:12.871900 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5x6t7" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="registry-server" containerID="cri-o://80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15" gracePeriod=2 Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.402518 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.449800 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-catalog-content\") pod \"5a38037b-b925-4eaa-833d-effa7af118ba\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.449862 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-utilities\") pod \"5a38037b-b925-4eaa-833d-effa7af118ba\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.449942 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z85vs\" (UniqueName: \"kubernetes.io/projected/5a38037b-b925-4eaa-833d-effa7af118ba-kube-api-access-z85vs\") pod \"5a38037b-b925-4eaa-833d-effa7af118ba\" (UID: \"5a38037b-b925-4eaa-833d-effa7af118ba\") " Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.451166 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-utilities" (OuterVolumeSpecName: "utilities") pod "5a38037b-b925-4eaa-833d-effa7af118ba" (UID: "5a38037b-b925-4eaa-833d-effa7af118ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.456156 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a38037b-b925-4eaa-833d-effa7af118ba-kube-api-access-z85vs" (OuterVolumeSpecName: "kube-api-access-z85vs") pod "5a38037b-b925-4eaa-833d-effa7af118ba" (UID: "5a38037b-b925-4eaa-833d-effa7af118ba"). InnerVolumeSpecName "kube-api-access-z85vs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.468076 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a38037b-b925-4eaa-833d-effa7af118ba" (UID: "5a38037b-b925-4eaa-833d-effa7af118ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.552264 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.552617 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a38037b-b925-4eaa-833d-effa7af118ba-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.552755 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z85vs\" (UniqueName: \"kubernetes.io/projected/5a38037b-b925-4eaa-833d-effa7af118ba-kube-api-access-z85vs\") on node \"crc\" DevicePath \"\"" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.828676 4732 generic.go:334] "Generic (PLEG): container finished" podID="5a38037b-b925-4eaa-833d-effa7af118ba" containerID="80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15" exitCode=0 Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.828815 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5x6t7" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.828846 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x6t7" event={"ID":"5a38037b-b925-4eaa-833d-effa7af118ba","Type":"ContainerDied","Data":"80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15"} Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.829232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5x6t7" event={"ID":"5a38037b-b925-4eaa-833d-effa7af118ba","Type":"ContainerDied","Data":"8a1240ea577bfc30478d711c45b70e56a450cb72226361dfbd8b8d992a9ac9fa"} Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.829275 4732 scope.go:117] "RemoveContainer" containerID="80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.868027 4732 scope.go:117] "RemoveContainer" containerID="5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.869324 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x6t7"] Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.881623 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5x6t7"] Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.894429 4732 scope.go:117] "RemoveContainer" containerID="cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.942323 4732 scope.go:117] "RemoveContainer" containerID="80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15" Oct 10 08:09:13 crc kubenswrapper[4732]: E1010 08:09:13.942767 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15\": container with ID starting with 80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15 not found: ID does not exist" containerID="80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.942803 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15"} err="failed to get container status \"80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15\": rpc error: code = NotFound desc = could not find container \"80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15\": container with ID starting with 80feb9741fa2c428775cc9a24cd59cd30a39e3a7950c39415994a14991c19b15 not found: ID does not exist" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.942827 4732 scope.go:117] "RemoveContainer" containerID="5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf" Oct 10 08:09:13 crc kubenswrapper[4732]: E1010 08:09:13.943083 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf\": container with ID starting with 5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf not found: ID does not exist" containerID="5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.943110 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf"} err="failed to get container status \"5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf\": rpc error: code = NotFound desc = could not find container \"5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf\": container with ID 
starting with 5fb39bf1dc0f02088ba1ca81136d14ce534a9055bcc61897c76445aa88415bbf not found: ID does not exist" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.943127 4732 scope.go:117] "RemoveContainer" containerID="cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18" Oct 10 08:09:13 crc kubenswrapper[4732]: E1010 08:09:13.943404 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18\": container with ID starting with cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18 not found: ID does not exist" containerID="cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18" Oct 10 08:09:13 crc kubenswrapper[4732]: I1010 08:09:13.943430 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18"} err="failed to get container status \"cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18\": rpc error: code = NotFound desc = could not find container \"cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18\": container with ID starting with cb78b2a88e9d0bc98622bb5ad77ad22fdc08665e576d1737e0d543e139beef18 not found: ID does not exist" Oct 10 08:09:14 crc kubenswrapper[4732]: I1010 08:09:14.256657 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 10 08:09:14 crc kubenswrapper[4732]: I1010 08:09:14.257022 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 10 08:09:14 crc kubenswrapper[4732]: I1010 08:09:14.312539 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 10 08:09:14 crc kubenswrapper[4732]: I1010 08:09:14.914657 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/openstack-galera-0" Oct 10 08:09:15 crc kubenswrapper[4732]: I1010 08:09:15.676641 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" path="/var/lib/kubelet/pods/5a38037b-b925-4eaa-833d-effa7af118ba/volumes" Oct 10 08:09:16 crc kubenswrapper[4732]: I1010 08:09:16.042893 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 10 08:09:16 crc kubenswrapper[4732]: I1010 08:09:16.115685 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="05bbbd5f-b1ea-4a6d-9787-082d27fbfae6" containerName="galera" probeResult="failure" output=< Oct 10 08:09:16 crc kubenswrapper[4732]: wsrep_local_state_comment (Joined) differs from Synced Oct 10 08:09:16 crc kubenswrapper[4732]: > Oct 10 08:09:22 crc kubenswrapper[4732]: E1010 08:09:22.054636 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a38037b_b925_4eaa_833d_effa7af118ba.slice\": RecentStats: unable to find data in memory cache]" Oct 10 08:09:24 crc kubenswrapper[4732]: I1010 08:09:24.686049 4732 scope.go:117] "RemoveContainer" containerID="3769ebca370b8ff685d67b5895aff0904d1beffd9c38fe84cfee7cdc6c0f7020" Oct 10 08:09:24 crc kubenswrapper[4732]: I1010 08:09:24.718799 4732 scope.go:117] "RemoveContainer" containerID="324ffba94e5e2016c542bc5d15ead398ce0b50f3e70b525c6a01f4b4b21c93a5" Oct 10 08:09:25 crc kubenswrapper[4732]: I1010 08:09:25.356432 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:09:25 crc kubenswrapper[4732]: I1010 08:09:25.356845 4732 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:09:25 crc kubenswrapper[4732]: I1010 08:09:25.811402 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 10 08:09:32 crc kubenswrapper[4732]: E1010 08:09:32.317620 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a38037b_b925_4eaa_833d_effa7af118ba.slice\": RecentStats: unable to find data in memory cache]" Oct 10 08:09:33 crc kubenswrapper[4732]: I1010 08:09:33.033237 4732 generic.go:334] "Generic (PLEG): container finished" podID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerID="23343e4052c7f2073f3a383e220af8a0d2a8fe0cc53723404974313264c4f4a8" exitCode=0 Oct 10 08:09:33 crc kubenswrapper[4732]: I1010 08:09:33.033320 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09d5a68f-9c78-46c7-9291-7e3dfad23f93","Type":"ContainerDied","Data":"23343e4052c7f2073f3a383e220af8a0d2a8fe0cc53723404974313264c4f4a8"} Oct 10 08:09:33 crc kubenswrapper[4732]: I1010 08:09:33.037144 4732 generic.go:334] "Generic (PLEG): container finished" podID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerID="18e4ac8756d3f4bde508164449ac6730f79866e994d4a05d74cd56dcbc62842e" exitCode=0 Oct 10 08:09:33 crc kubenswrapper[4732]: I1010 08:09:33.037234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"25efdbbb-72bc-423d-a78a-d623e1bd6627","Type":"ContainerDied","Data":"18e4ac8756d3f4bde508164449ac6730f79866e994d4a05d74cd56dcbc62842e"} Oct 10 08:09:34 crc kubenswrapper[4732]: I1010 
08:09:34.052089 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"25efdbbb-72bc-423d-a78a-d623e1bd6627","Type":"ContainerStarted","Data":"c661315cbd2cde3f8cd2ee28430c91cb06e63ecd70d90c43366b70c562d1940e"} Oct 10 08:09:34 crc kubenswrapper[4732]: I1010 08:09:34.057852 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09d5a68f-9c78-46c7-9291-7e3dfad23f93","Type":"ContainerStarted","Data":"6af0acfd84b833112c738eb7c359d9eb8bffad4e7d959527ca5c3c53251ade28"} Oct 10 08:09:34 crc kubenswrapper[4732]: I1010 08:09:34.058145 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:09:34 crc kubenswrapper[4732]: I1010 08:09:34.091912 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.420986101 podStartE2EDuration="1m3.091892517s" podCreationTimestamp="2025-10-10 08:08:31 +0000 UTC" firstStartedPulling="2025-10-10 08:08:33.61900233 +0000 UTC m=+4640.688593571" lastFinishedPulling="2025-10-10 08:08:59.289908746 +0000 UTC m=+4666.359499987" observedRunningTime="2025-10-10 08:09:34.088568627 +0000 UTC m=+4701.158159888" watchObservedRunningTime="2025-10-10 08:09:34.091892517 +0000 UTC m=+4701.161483758" Oct 10 08:09:34 crc kubenswrapper[4732]: I1010 08:09:34.111336 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.418934656 podStartE2EDuration="1m3.111313441s" podCreationTimestamp="2025-10-10 08:08:31 +0000 UTC" firstStartedPulling="2025-10-10 08:08:33.548673963 +0000 UTC m=+4640.618265204" lastFinishedPulling="2025-10-10 08:08:59.241052748 +0000 UTC m=+4666.310643989" observedRunningTime="2025-10-10 08:09:34.10792768 +0000 UTC m=+4701.177518981" watchObservedRunningTime="2025-10-10 08:09:34.111313441 +0000 UTC m=+4701.180904722" Oct 10 08:09:42 crc 
kubenswrapper[4732]: E1010 08:09:42.529759 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a38037b_b925_4eaa_833d_effa7af118ba.slice\": RecentStats: unable to find data in memory cache]" Oct 10 08:09:43 crc kubenswrapper[4732]: I1010 08:09:43.073685 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:09:43 crc kubenswrapper[4732]: I1010 08:09:43.093326 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 10 08:09:43 crc kubenswrapper[4732]: I1010 08:09:43.097203 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.586893 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8549d7dd49-h2t57"] Oct 10 08:09:48 crc kubenswrapper[4732]: E1010 08:09:48.587835 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ac87e4-6966-4ccc-84e8-ac814d203792" containerName="init" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.587856 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ac87e4-6966-4ccc-84e8-ac814d203792" containerName="init" Oct 10 08:09:48 crc kubenswrapper[4732]: E1010 08:09:48.587890 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="extract-utilities" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.587902 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="extract-utilities" Oct 10 08:09:48 crc kubenswrapper[4732]: E1010 08:09:48.587924 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17434694-200c-4f66-9a6d-9b2a05e63bc7" containerName="init" Oct 10 08:09:48 crc 
kubenswrapper[4732]: I1010 08:09:48.587936 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="17434694-200c-4f66-9a6d-9b2a05e63bc7" containerName="init" Oct 10 08:09:48 crc kubenswrapper[4732]: E1010 08:09:48.587961 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerName="init" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.587973 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerName="init" Oct 10 08:09:48 crc kubenswrapper[4732]: E1010 08:09:48.587997 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerName="dnsmasq-dns" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.588011 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerName="dnsmasq-dns" Oct 10 08:09:48 crc kubenswrapper[4732]: E1010 08:09:48.588032 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="extract-content" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.588044 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="extract-content" Oct 10 08:09:48 crc kubenswrapper[4732]: E1010 08:09:48.588062 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="registry-server" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.588117 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="registry-server" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.588401 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a38037b-b925-4eaa-833d-effa7af118ba" containerName="registry-server" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.588467 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="65ac87e4-6966-4ccc-84e8-ac814d203792" containerName="init" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.588484 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="85bdd98f-c6f6-4ef2-83b5-8793059cad85" containerName="dnsmasq-dns" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.588507 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="17434694-200c-4f66-9a6d-9b2a05e63bc7" containerName="init" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.590741 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.601028 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8549d7dd49-h2t57"] Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.709071 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrp48\" (UniqueName: \"kubernetes.io/projected/9ff69888-543b-4450-8760-23f55ccbb673-kube-api-access-nrp48\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.709144 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-config\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.709191 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-dns-svc\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: 
\"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.811559 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrp48\" (UniqueName: \"kubernetes.io/projected/9ff69888-543b-4450-8760-23f55ccbb673-kube-api-access-nrp48\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.811772 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-config\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.811908 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-dns-svc\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.813030 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-config\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.813081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-dns-svc\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc 
kubenswrapper[4732]: I1010 08:09:48.833332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrp48\" (UniqueName: \"kubernetes.io/projected/9ff69888-543b-4450-8760-23f55ccbb673-kube-api-access-nrp48\") pod \"dnsmasq-dns-8549d7dd49-h2t57\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:48 crc kubenswrapper[4732]: I1010 08:09:48.918086 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:49 crc kubenswrapper[4732]: I1010 08:09:49.338468 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:09:49 crc kubenswrapper[4732]: I1010 08:09:49.367397 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8549d7dd49-h2t57"] Oct 10 08:09:49 crc kubenswrapper[4732]: I1010 08:09:49.930539 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:09:50 crc kubenswrapper[4732]: I1010 08:09:50.223834 4732 generic.go:334] "Generic (PLEG): container finished" podID="9ff69888-543b-4450-8760-23f55ccbb673" containerID="c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02" exitCode=0 Oct 10 08:09:50 crc kubenswrapper[4732]: I1010 08:09:50.224101 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" event={"ID":"9ff69888-543b-4450-8760-23f55ccbb673","Type":"ContainerDied","Data":"c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02"} Oct 10 08:09:50 crc kubenswrapper[4732]: I1010 08:09:50.224152 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" event={"ID":"9ff69888-543b-4450-8760-23f55ccbb673","Type":"ContainerStarted","Data":"e1bca6895ca2e7e9911a985c434b7508e2c99b1bfe6120f7ae6da34f7800c02d"} Oct 10 08:09:51 crc kubenswrapper[4732]: I1010 08:09:51.257516 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" event={"ID":"9ff69888-543b-4450-8760-23f55ccbb673","Type":"ContainerStarted","Data":"ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f"} Oct 10 08:09:51 crc kubenswrapper[4732]: I1010 08:09:51.258059 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:51 crc kubenswrapper[4732]: I1010 08:09:51.284173 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" podStartSLOduration=3.284152518 podStartE2EDuration="3.284152518s" podCreationTimestamp="2025-10-10 08:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:09:51.277820237 +0000 UTC m=+4718.347411518" watchObservedRunningTime="2025-10-10 08:09:51.284152518 +0000 UTC m=+4718.353743779" Oct 10 08:09:52 crc kubenswrapper[4732]: E1010 08:09:52.741435 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a38037b_b925_4eaa_833d_effa7af118ba.slice\": RecentStats: unable to find data in memory cache]" Oct 10 08:09:53 crc kubenswrapper[4732]: I1010 08:09:53.523436 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerName="rabbitmq" containerID="cri-o://c661315cbd2cde3f8cd2ee28430c91cb06e63ecd70d90c43366b70c562d1940e" gracePeriod=604796 Oct 10 08:09:54 crc kubenswrapper[4732]: I1010 08:09:54.122533 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerName="rabbitmq" 
containerID="cri-o://6af0acfd84b833112c738eb7c359d9eb8bffad4e7d959527ca5c3c53251ade28" gracePeriod=604796 Oct 10 08:09:55 crc kubenswrapper[4732]: I1010 08:09:55.356356 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:09:55 crc kubenswrapper[4732]: I1010 08:09:55.356760 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:09:58 crc kubenswrapper[4732]: I1010 08:09:58.919941 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:09:58 crc kubenswrapper[4732]: I1010 08:09:58.988096 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4fc659cc-8k8nj"] Oct 10 08:09:58 crc kubenswrapper[4732]: I1010 08:09:58.988294 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" podUID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerName="dnsmasq-dns" containerID="cri-o://e767a6d58a9c5fd1799020d0852fba2c18b09ae0a642ebea7a82f66c60b9853a" gracePeriod=10 Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.334755 4732 generic.go:334] "Generic (PLEG): container finished" podID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerID="e767a6d58a9c5fd1799020d0852fba2c18b09ae0a642ebea7a82f66c60b9853a" exitCode=0 Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.335085 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" 
event={"ID":"01731ae6-0e0d-4b75-834a-9ded3d8718d3","Type":"ContainerDied","Data":"e767a6d58a9c5fd1799020d0852fba2c18b09ae0a642ebea7a82f66c60b9853a"} Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.465005 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.592983 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-dns-svc\") pod \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.593088 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-config\") pod \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.593207 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77zx2\" (UniqueName: \"kubernetes.io/projected/01731ae6-0e0d-4b75-834a-9ded3d8718d3-kube-api-access-77zx2\") pod \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\" (UID: \"01731ae6-0e0d-4b75-834a-9ded3d8718d3\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.598295 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01731ae6-0e0d-4b75-834a-9ded3d8718d3-kube-api-access-77zx2" (OuterVolumeSpecName: "kube-api-access-77zx2") pod "01731ae6-0e0d-4b75-834a-9ded3d8718d3" (UID: "01731ae6-0e0d-4b75-834a-9ded3d8718d3"). InnerVolumeSpecName "kube-api-access-77zx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.638223 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-config" (OuterVolumeSpecName: "config") pod "01731ae6-0e0d-4b75-834a-9ded3d8718d3" (UID: "01731ae6-0e0d-4b75-834a-9ded3d8718d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.643253 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01731ae6-0e0d-4b75-834a-9ded3d8718d3" (UID: "01731ae6-0e0d-4b75-834a-9ded3d8718d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.695033 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.695072 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77zx2\" (UniqueName: \"kubernetes.io/projected/01731ae6-0e0d-4b75-834a-9ded3d8718d3-kube-api-access-77zx2\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:09:59.695085 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01731ae6-0e0d-4b75-834a-9ded3d8718d3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.347292 4732 generic.go:334] "Generic (PLEG): container finished" podID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerID="6af0acfd84b833112c738eb7c359d9eb8bffad4e7d959527ca5c3c53251ade28" exitCode=0 Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 
08:10:00.347382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09d5a68f-9c78-46c7-9291-7e3dfad23f93","Type":"ContainerDied","Data":"6af0acfd84b833112c738eb7c359d9eb8bffad4e7d959527ca5c3c53251ade28"} Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.351047 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.351766 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d4fc659cc-8k8nj" event={"ID":"01731ae6-0e0d-4b75-834a-9ded3d8718d3","Type":"ContainerDied","Data":"9ee0599ae58670937a2f212206a5a62c0373d1856740c30d1355e98fc10022d6"} Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.351801 4732 scope.go:117] "RemoveContainer" containerID="e767a6d58a9c5fd1799020d0852fba2c18b09ae0a642ebea7a82f66c60b9853a" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.354738 4732 generic.go:334] "Generic (PLEG): container finished" podID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerID="c661315cbd2cde3f8cd2ee28430c91cb06e63ecd70d90c43366b70c562d1940e" exitCode=0 Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.354766 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"25efdbbb-72bc-423d-a78a-d623e1bd6627","Type":"ContainerDied","Data":"c661315cbd2cde3f8cd2ee28430c91cb06e63ecd70d90c43366b70c562d1940e"} Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.354781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"25efdbbb-72bc-423d-a78a-d623e1bd6627","Type":"ContainerDied","Data":"4405eeb21eb6e193ae59d7fddd870fa91bdc2d78d402be2a3b14d76408ced92c"} Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.354814 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4405eeb21eb6e193ae59d7fddd870fa91bdc2d78d402be2a3b14d76408ced92c" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.389847 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.405649 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d4fc659cc-8k8nj"] Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.411285 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d4fc659cc-8k8nj"] Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.462246 4732 scope.go:117] "RemoveContainer" containerID="196c894bc57ba3b08383681e4e1c6c75c2cb75ee81d00389895aecf75d71f317" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506152 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-confd\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25efdbbb-72bc-423d-a78a-d623e1bd6627-pod-info\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506262 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-plugins-conf\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506322 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-tls\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506354 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-server-conf\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506408 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-config-data\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506665 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506742 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25efdbbb-72bc-423d-a78a-d623e1bd6627-erlang-cookie-secret\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.506780 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-plugins\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: 
I1010 08:10:00.507791 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-erlang-cookie\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.507881 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqgsv\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-kube-api-access-hqgsv\") pod \"25efdbbb-72bc-423d-a78a-d623e1bd6627\" (UID: \"25efdbbb-72bc-423d-a78a-d623e1bd6627\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.513600 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.514039 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.516839 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.517055 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25efdbbb-72bc-423d-a78a-d623e1bd6627-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.517299 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/25efdbbb-72bc-423d-a78a-d623e1bd6627-pod-info" (OuterVolumeSpecName: "pod-info") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.517462 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.517608 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-kube-api-access-hqgsv" (OuterVolumeSpecName: "kube-api-access-hqgsv") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "kube-api-access-hqgsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.536447 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3" (OuterVolumeSpecName: "persistence") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.544309 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-config-data" (OuterVolumeSpecName: "config-data") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.568387 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-server-conf" (OuterVolumeSpecName: "server-conf") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611147 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611176 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqgsv\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-kube-api-access-hqgsv\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611185 4732 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/25efdbbb-72bc-423d-a78a-d623e1bd6627-pod-info\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611194 4732 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611202 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611209 4732 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-server-conf\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611241 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") on node \"crc\" " Oct 10 
08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611252 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/25efdbbb-72bc-423d-a78a-d623e1bd6627-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611262 4732 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/25efdbbb-72bc-423d-a78a-d623e1bd6627-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.611274 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.616035 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "25efdbbb-72bc-423d-a78a-d623e1bd6627" (UID: "25efdbbb-72bc-423d-a78a-d623e1bd6627"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.650937 4732 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.651404 4732 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3") on node "crc" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.701236 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.712089 4732 reconciler_common.go:293] "Volume detached for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.712116 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/25efdbbb-72bc-423d-a78a-d623e1bd6627-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.812862 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-plugins\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813022 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813051 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpltl\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-kube-api-access-bpltl\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813071 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-erlang-cookie\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813108 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09d5a68f-9c78-46c7-9291-7e3dfad23f93-erlang-cookie-secret\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813139 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-plugins-conf\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813156 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-config-data\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813195 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-confd\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813237 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-tls\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 
08:10:00.813282 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09d5a68f-9c78-46c7-9291-7e3dfad23f93-pod-info\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813290 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813305 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-server-conf\") pod \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\" (UID: \"09d5a68f-9c78-46c7-9291-7e3dfad23f93\") " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813796 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.813972 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.816911 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.829823 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d5a68f-9c78-46c7-9291-7e3dfad23f93-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.835898 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-kube-api-access-bpltl" (OuterVolumeSpecName: "kube-api-access-bpltl") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "kube-api-access-bpltl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.836022 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/09d5a68f-9c78-46c7-9291-7e3dfad23f93-pod-info" (OuterVolumeSpecName: "pod-info") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.841941 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.854182 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-config-data" (OuterVolumeSpecName: "config-data") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.870704 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1" (OuterVolumeSpecName: "persistence") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "pvc-accc41cf-484c-4367-a8de-661f32228ec1". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.885190 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-server-conf" (OuterVolumeSpecName: "server-conf") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915265 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915301 4732 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09d5a68f-9c78-46c7-9291-7e3dfad23f93-pod-info\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915311 4732 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-server-conf\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915347 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") on node \"crc\" " Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915360 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpltl\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-kube-api-access-bpltl\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915371 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915379 4732 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09d5a68f-9c78-46c7-9291-7e3dfad23f93-erlang-cookie-secret\") on node \"crc\" DevicePath 
\"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915391 4732 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.915401 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/09d5a68f-9c78-46c7-9291-7e3dfad23f93-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.920847 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "09d5a68f-9c78-46c7-9291-7e3dfad23f93" (UID: "09d5a68f-9c78-46c7-9291-7e3dfad23f93"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.936152 4732 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 10 08:10:00 crc kubenswrapper[4732]: I1010 08:10:00.936359 4732 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-accc41cf-484c-4367-a8de-661f32228ec1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1") on node "crc" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.017301 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09d5a68f-9c78-46c7-9291-7e3dfad23f93-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.017352 4732 reconciler_common.go:293] "Volume detached for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.365489 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09d5a68f-9c78-46c7-9291-7e3dfad23f93","Type":"ContainerDied","Data":"f7af1d44b5d351b4a31b419a93034a1889968f2714e265a1686e3498bd7cbc01"} Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.365781 4732 scope.go:117] "RemoveContainer" containerID="6af0acfd84b833112c738eb7c359d9eb8bffad4e7d959527ca5c3c53251ade28" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.365548 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.365531 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.399720 4732 scope.go:117] "RemoveContainer" containerID="23343e4052c7f2073f3a383e220af8a0d2a8fe0cc53723404974313264c4f4a8" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.419138 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.441756 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.451150 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: E1010 08:10:01.451651 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerName="init" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.451669 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerName="init" Oct 10 08:10:01 crc kubenswrapper[4732]: E1010 08:10:01.451721 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerName="dnsmasq-dns" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.451735 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerName="dnsmasq-dns" Oct 10 08:10:01 crc kubenswrapper[4732]: E1010 08:10:01.451768 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerName="rabbitmq" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.451781 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerName="rabbitmq" Oct 10 08:10:01 crc kubenswrapper[4732]: E1010 08:10:01.451799 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerName="setup-container" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.451821 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerName="setup-container" Oct 10 08:10:01 crc kubenswrapper[4732]: E1010 08:10:01.451841 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerName="setup-container" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.451853 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerName="setup-container" Oct 10 08:10:01 crc kubenswrapper[4732]: E1010 08:10:01.451874 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerName="rabbitmq" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.451887 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerName="rabbitmq" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.452139 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="25efdbbb-72bc-423d-a78a-d623e1bd6627" containerName="rabbitmq" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.452188 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" containerName="dnsmasq-dns" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.452213 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" containerName="rabbitmq" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.453907 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.466990 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.467372 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.467576 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.467973 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.468187 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-58gq6" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.468388 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.468576 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.479135 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.485447 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.501106 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.509155 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.510513 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.513539 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.513735 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.513929 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.513976 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.515386 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.515516 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8vtq6" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.516156 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.520248 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.526584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.526660 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fxtcs\" (UniqueName: \"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-kube-api-access-fxtcs\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.526681 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.526912 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.526975 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.527090 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.527128 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.527167 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.527281 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.527322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-config-data\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.527364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629111 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629173 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629202 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629241 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629258 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629278 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f822368-67da-407d-9a7e-def860134a98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629302 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtwt\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-kube-api-access-8xtwt\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629318 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-config-data\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629416 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629441 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f822368-67da-407d-9a7e-def860134a98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629471 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629503 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629531 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxtcs\" (UniqueName: 
\"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-kube-api-access-fxtcs\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629552 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629572 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629597 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629631 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629647 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-confd\") pod \"rabbitmq-server-0\" 
(UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.629677 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.630102 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.631056 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-config-data\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.631338 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.631815 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.633332 4732 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.633407 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/95d940685356a8fa009428f47b914765f2e8f1c1e9fd807253f68cbe91376583/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.634975 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.635793 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.635853 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.639458 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.648164 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.652769 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxtcs\" (UniqueName: \"kubernetes.io/projected/9167b3d7-e83f-4e83-8dfe-a1daa954ea9f-kube-api-access-fxtcs\") pod \"rabbitmq-server-0\" (UID: \"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.675080 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01731ae6-0e0d-4b75-834a-9ded3d8718d3" path="/var/lib/kubelet/pods/01731ae6-0e0d-4b75-834a-9ded3d8718d3/volumes" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.676908 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d5a68f-9c78-46c7-9291-7e3dfad23f93" path="/var/lib/kubelet/pods/09d5a68f-9c78-46c7-9291-7e3dfad23f93/volumes" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.679919 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25efdbbb-72bc-423d-a78a-d623e1bd6627" path="/var/lib/kubelet/pods/25efdbbb-72bc-423d-a78a-d623e1bd6627/volumes" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.680508 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-130c202d-7e5f-4cec-af89-1176ff6c6ed3\") pod \"rabbitmq-server-0\" (UID: 
\"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f\") " pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.731566 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.732156 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.732439 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.732535 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.732583 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f822368-67da-407d-9a7e-def860134a98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 
08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.732622 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtwt\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-kube-api-access-8xtwt\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.733315 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.733378 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.733381 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f822368-67da-407d-9a7e-def860134a98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.733616 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.734454 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.734545 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.734613 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.734756 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.735643 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.735909 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3f822368-67da-407d-9a7e-def860134a98-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.737482 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f822368-67da-407d-9a7e-def860134a98-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.738025 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.738179 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e59a483b765e13d49cfb3268cec746d1de826e8104c888b4671a6933e93acf8/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.739367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f822368-67da-407d-9a7e-def860134a98-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.740416 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.741674 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.762523 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtwt\" (UniqueName: \"kubernetes.io/projected/3f822368-67da-407d-9a7e-def860134a98-kube-api-access-8xtwt\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.782316 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.787964 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-accc41cf-484c-4367-a8de-661f32228ec1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-accc41cf-484c-4367-a8de-661f32228ec1\") pod \"rabbitmq-cell1-server-0\" (UID: \"3f822368-67da-407d-9a7e-def860134a98\") " pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:01 crc kubenswrapper[4732]: I1010 08:10:01.825940 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:02 crc kubenswrapper[4732]: I1010 08:10:02.086267 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 10 08:10:02 crc kubenswrapper[4732]: I1010 08:10:02.141053 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 10 08:10:02 crc kubenswrapper[4732]: W1010 08:10:02.155979 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f822368_67da_407d_9a7e_def860134a98.slice/crio-998a870ef53aacc5a201f82b0f3646254ae21ab98313ad04c3f6164444d0f12c WatchSource:0}: Error finding container 998a870ef53aacc5a201f82b0f3646254ae21ab98313ad04c3f6164444d0f12c: Status 404 returned error can't find the container with id 998a870ef53aacc5a201f82b0f3646254ae21ab98313ad04c3f6164444d0f12c Oct 10 08:10:02 crc kubenswrapper[4732]: I1010 08:10:02.376090 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f","Type":"ContainerStarted","Data":"ff89f832d2c2eab910842032d69294d6958b3651ed8b34c8b505902ddd285c39"} Oct 10 08:10:02 crc kubenswrapper[4732]: I1010 08:10:02.379094 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f822368-67da-407d-9a7e-def860134a98","Type":"ContainerStarted","Data":"998a870ef53aacc5a201f82b0f3646254ae21ab98313ad04c3f6164444d0f12c"} Oct 10 08:10:02 crc kubenswrapper[4732]: E1010 08:10:02.975538 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a38037b_b925_4eaa_833d_effa7af118ba.slice\": RecentStats: unable to find data in memory cache]" Oct 10 08:10:04 crc kubenswrapper[4732]: I1010 08:10:04.411274 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f822368-67da-407d-9a7e-def860134a98","Type":"ContainerStarted","Data":"5e3afd6d99d615e9d9fc19b98b21c27dd1854c3e9be81ebea093f44bac1e57b5"} Oct 10 08:10:04 crc kubenswrapper[4732]: I1010 08:10:04.414771 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f","Type":"ContainerStarted","Data":"a263f97fabf844f8457bdf5be9ee83381352c63e1820e8ce6926ca01bc0de4ff"} Oct 10 08:10:13 crc kubenswrapper[4732]: E1010 08:10:13.205268 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a38037b_b925_4eaa_833d_effa7af118ba.slice\": RecentStats: unable to find data in memory cache]" Oct 10 08:10:24 crc kubenswrapper[4732]: I1010 08:10:24.845174 4732 scope.go:117] "RemoveContainer" containerID="bf6e57e4c2138b9c81610d8539c11289786995345a1a897bd1777ac0d5bd0f5c" Oct 10 08:10:25 crc kubenswrapper[4732]: I1010 08:10:25.356288 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:10:25 crc kubenswrapper[4732]: I1010 08:10:25.356764 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:10:25 crc kubenswrapper[4732]: I1010 08:10:25.356854 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:10:25 crc 
kubenswrapper[4732]: I1010 08:10:25.357743 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"414b67626caa2a3a0941b29c620d3e844d3d0e2537b4a836117644eb6baed081"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:10:25 crc kubenswrapper[4732]: I1010 08:10:25.357856 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://414b67626caa2a3a0941b29c620d3e844d3d0e2537b4a836117644eb6baed081" gracePeriod=600 Oct 10 08:10:25 crc kubenswrapper[4732]: I1010 08:10:25.628113 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="414b67626caa2a3a0941b29c620d3e844d3d0e2537b4a836117644eb6baed081" exitCode=0 Oct 10 08:10:25 crc kubenswrapper[4732]: I1010 08:10:25.628159 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"414b67626caa2a3a0941b29c620d3e844d3d0e2537b4a836117644eb6baed081"} Oct 10 08:10:25 crc kubenswrapper[4732]: I1010 08:10:25.628201 4732 scope.go:117] "RemoveContainer" containerID="7d622e876dfc78f2b592232a8fc95d3ed7d14902279f0a3f20b4cb3da0592629" Oct 10 08:10:26 crc kubenswrapper[4732]: I1010 08:10:26.635781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"} Oct 10 08:10:36 crc kubenswrapper[4732]: I1010 08:10:36.728332 4732 generic.go:334] 
"Generic (PLEG): container finished" podID="9167b3d7-e83f-4e83-8dfe-a1daa954ea9f" containerID="a263f97fabf844f8457bdf5be9ee83381352c63e1820e8ce6926ca01bc0de4ff" exitCode=0 Oct 10 08:10:36 crc kubenswrapper[4732]: I1010 08:10:36.728428 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f","Type":"ContainerDied","Data":"a263f97fabf844f8457bdf5be9ee83381352c63e1820e8ce6926ca01bc0de4ff"} Oct 10 08:10:37 crc kubenswrapper[4732]: I1010 08:10:37.739061 4732 generic.go:334] "Generic (PLEG): container finished" podID="3f822368-67da-407d-9a7e-def860134a98" containerID="5e3afd6d99d615e9d9fc19b98b21c27dd1854c3e9be81ebea093f44bac1e57b5" exitCode=0 Oct 10 08:10:37 crc kubenswrapper[4732]: I1010 08:10:37.739116 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f822368-67da-407d-9a7e-def860134a98","Type":"ContainerDied","Data":"5e3afd6d99d615e9d9fc19b98b21c27dd1854c3e9be81ebea093f44bac1e57b5"} Oct 10 08:10:37 crc kubenswrapper[4732]: I1010 08:10:37.741465 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9167b3d7-e83f-4e83-8dfe-a1daa954ea9f","Type":"ContainerStarted","Data":"c55ec478e7a5bd72564b242c1feb78e4a0617ca6364930b5e122fb439aa6b3a4"} Oct 10 08:10:37 crc kubenswrapper[4732]: I1010 08:10:37.741739 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 10 08:10:37 crc kubenswrapper[4732]: I1010 08:10:37.803355 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.803331712 podStartE2EDuration="36.803331712s" podCreationTimestamp="2025-10-10 08:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:10:37.791860722 +0000 UTC m=+4764.861451983" 
watchObservedRunningTime="2025-10-10 08:10:37.803331712 +0000 UTC m=+4764.872922993" Oct 10 08:10:38 crc kubenswrapper[4732]: I1010 08:10:38.751601 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3f822368-67da-407d-9a7e-def860134a98","Type":"ContainerStarted","Data":"14a276140f351e5f8bf4ec39bcf65b44328f3e35dcace708ef89e6ae57b9b2f9"} Oct 10 08:10:38 crc kubenswrapper[4732]: I1010 08:10:38.752323 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:38 crc kubenswrapper[4732]: I1010 08:10:38.779543 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.779522795 podStartE2EDuration="37.779522795s" podCreationTimestamp="2025-10-10 08:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:10:38.771007266 +0000 UTC m=+4765.840598517" watchObservedRunningTime="2025-10-10 08:10:38.779522795 +0000 UTC m=+4765.849114046" Oct 10 08:10:51 crc kubenswrapper[4732]: I1010 08:10:51.786044 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 10 08:10:51 crc kubenswrapper[4732]: I1010 08:10:51.828878 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.640053 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.643291 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.653003 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5ddsv" Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.659642 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.710437 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwbk\" (UniqueName: \"kubernetes.io/projected/0ac3d337-332d-453c-87b9-5d9a7d8dcc8e-kube-api-access-cqwbk\") pod \"mariadb-client-1-default\" (UID: \"0ac3d337-332d-453c-87b9-5d9a7d8dcc8e\") " pod="openstack/mariadb-client-1-default" Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.812081 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwbk\" (UniqueName: \"kubernetes.io/projected/0ac3d337-332d-453c-87b9-5d9a7d8dcc8e-kube-api-access-cqwbk\") pod \"mariadb-client-1-default\" (UID: \"0ac3d337-332d-453c-87b9-5d9a7d8dcc8e\") " pod="openstack/mariadb-client-1-default" Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.840658 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwbk\" (UniqueName: \"kubernetes.io/projected/0ac3d337-332d-453c-87b9-5d9a7d8dcc8e-kube-api-access-cqwbk\") pod \"mariadb-client-1-default\" (UID: \"0ac3d337-332d-453c-87b9-5d9a7d8dcc8e\") " pod="openstack/mariadb-client-1-default" Oct 10 08:10:55 crc kubenswrapper[4732]: I1010 08:10:55.987905 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 08:10:56 crc kubenswrapper[4732]: I1010 08:10:56.585109 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 08:10:56 crc kubenswrapper[4732]: I1010 08:10:56.931576 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"0ac3d337-332d-453c-87b9-5d9a7d8dcc8e","Type":"ContainerStarted","Data":"b3936c23b52edcaa284cae6009a98144bd86a9a47dc23200f3780ea4cce864e5"} Oct 10 08:10:57 crc kubenswrapper[4732]: I1010 08:10:57.944678 4732 generic.go:334] "Generic (PLEG): container finished" podID="0ac3d337-332d-453c-87b9-5d9a7d8dcc8e" containerID="4f5bcd75bf79349d4e950bb6523440387bf77ad27099b5898abcdbb58bce6260" exitCode=0 Oct 10 08:10:57 crc kubenswrapper[4732]: I1010 08:10:57.944772 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"0ac3d337-332d-453c-87b9-5d9a7d8dcc8e","Type":"ContainerDied","Data":"4f5bcd75bf79349d4e950bb6523440387bf77ad27099b5898abcdbb58bce6260"} Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.376183 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.407333 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_0ac3d337-332d-453c-87b9-5d9a7d8dcc8e/mariadb-client-1-default/0.log" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.440628 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.448432 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.477475 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqwbk\" (UniqueName: \"kubernetes.io/projected/0ac3d337-332d-453c-87b9-5d9a7d8dcc8e-kube-api-access-cqwbk\") pod \"0ac3d337-332d-453c-87b9-5d9a7d8dcc8e\" (UID: \"0ac3d337-332d-453c-87b9-5d9a7d8dcc8e\") " Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.486906 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac3d337-332d-453c-87b9-5d9a7d8dcc8e-kube-api-access-cqwbk" (OuterVolumeSpecName: "kube-api-access-cqwbk") pod "0ac3d337-332d-453c-87b9-5d9a7d8dcc8e" (UID: "0ac3d337-332d-453c-87b9-5d9a7d8dcc8e"). InnerVolumeSpecName "kube-api-access-cqwbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.580644 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqwbk\" (UniqueName: \"kubernetes.io/projected/0ac3d337-332d-453c-87b9-5d9a7d8dcc8e-kube-api-access-cqwbk\") on node \"crc\" DevicePath \"\"" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.670565 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac3d337-332d-453c-87b9-5d9a7d8dcc8e" path="/var/lib/kubelet/pods/0ac3d337-332d-453c-87b9-5d9a7d8dcc8e/volumes" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.838924 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 08:10:59 crc kubenswrapper[4732]: E1010 08:10:59.839355 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac3d337-332d-453c-87b9-5d9a7d8dcc8e" containerName="mariadb-client-1-default" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.839370 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac3d337-332d-453c-87b9-5d9a7d8dcc8e" containerName="mariadb-client-1-default" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.839558 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac3d337-332d-453c-87b9-5d9a7d8dcc8e" containerName="mariadb-client-1-default" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.840335 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.846427 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.885228 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tfk\" (UniqueName: \"kubernetes.io/projected/f92f2a3b-8fef-46be-9652-08b81cd0746a-kube-api-access-95tfk\") pod \"mariadb-client-2-default\" (UID: \"f92f2a3b-8fef-46be-9652-08b81cd0746a\") " pod="openstack/mariadb-client-2-default" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.967678 4732 scope.go:117] "RemoveContainer" containerID="4f5bcd75bf79349d4e950bb6523440387bf77ad27099b5898abcdbb58bce6260" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.967762 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 10 08:10:59 crc kubenswrapper[4732]: I1010 08:10:59.988680 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95tfk\" (UniqueName: \"kubernetes.io/projected/f92f2a3b-8fef-46be-9652-08b81cd0746a-kube-api-access-95tfk\") pod \"mariadb-client-2-default\" (UID: \"f92f2a3b-8fef-46be-9652-08b81cd0746a\") " pod="openstack/mariadb-client-2-default" Oct 10 08:11:00 crc kubenswrapper[4732]: I1010 08:11:00.019041 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95tfk\" (UniqueName: \"kubernetes.io/projected/f92f2a3b-8fef-46be-9652-08b81cd0746a-kube-api-access-95tfk\") pod \"mariadb-client-2-default\" (UID: \"f92f2a3b-8fef-46be-9652-08b81cd0746a\") " pod="openstack/mariadb-client-2-default" Oct 10 08:11:00 crc kubenswrapper[4732]: I1010 08:11:00.170028 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 08:11:00 crc kubenswrapper[4732]: I1010 08:11:00.805633 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 08:11:00 crc kubenswrapper[4732]: I1010 08:11:00.982323 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"f92f2a3b-8fef-46be-9652-08b81cd0746a","Type":"ContainerStarted","Data":"149eaef9b93aa8fc43a9d660648e81811b05e1750b9f2dd535a92218586c0747"} Oct 10 08:11:01 crc kubenswrapper[4732]: I1010 08:11:01.996240 4732 generic.go:334] "Generic (PLEG): container finished" podID="f92f2a3b-8fef-46be-9652-08b81cd0746a" containerID="22fdd4290796499d5ff8d0657fed3bcaa3e8a805f63cd251a664d0d2fd3f9442" exitCode=0 Oct 10 08:11:01 crc kubenswrapper[4732]: I1010 08:11:01.996328 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"f92f2a3b-8fef-46be-9652-08b81cd0746a","Type":"ContainerDied","Data":"22fdd4290796499d5ff8d0657fed3bcaa3e8a805f63cd251a664d0d2fd3f9442"} Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.442206 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.493808 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_f92f2a3b-8fef-46be-9652-08b81cd0746a/mariadb-client-2-default/0.log" Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.528971 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.538443 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.549580 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95tfk\" (UniqueName: \"kubernetes.io/projected/f92f2a3b-8fef-46be-9652-08b81cd0746a-kube-api-access-95tfk\") pod \"f92f2a3b-8fef-46be-9652-08b81cd0746a\" (UID: \"f92f2a3b-8fef-46be-9652-08b81cd0746a\") " Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.557923 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92f2a3b-8fef-46be-9652-08b81cd0746a-kube-api-access-95tfk" (OuterVolumeSpecName: "kube-api-access-95tfk") pod "f92f2a3b-8fef-46be-9652-08b81cd0746a" (UID: "f92f2a3b-8fef-46be-9652-08b81cd0746a"). InnerVolumeSpecName "kube-api-access-95tfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.650918 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95tfk\" (UniqueName: \"kubernetes.io/projected/f92f2a3b-8fef-46be-9652-08b81cd0746a-kube-api-access-95tfk\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:03 crc kubenswrapper[4732]: I1010 08:11:03.674666 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f92f2a3b-8fef-46be-9652-08b81cd0746a" path="/var/lib/kubelet/pods/f92f2a3b-8fef-46be-9652-08b81cd0746a/volumes" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.020300 4732 scope.go:117] "RemoveContainer" containerID="22fdd4290796499d5ff8d0657fed3bcaa3e8a805f63cd251a664d0d2fd3f9442" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.020638 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.063077 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 10 08:11:04 crc kubenswrapper[4732]: E1010 08:11:04.064027 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92f2a3b-8fef-46be-9652-08b81cd0746a" containerName="mariadb-client-2-default" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.064084 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92f2a3b-8fef-46be-9652-08b81cd0746a" containerName="mariadb-client-2-default" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.064454 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92f2a3b-8fef-46be-9652-08b81cd0746a" containerName="mariadb-client-2-default" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.065095 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.068247 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5ddsv" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.076834 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.160813 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm9k5\" (UniqueName: \"kubernetes.io/projected/2a8176b4-3902-418e-9cbe-39886135c176-kube-api-access-mm9k5\") pod \"mariadb-client-1\" (UID: \"2a8176b4-3902-418e-9cbe-39886135c176\") " pod="openstack/mariadb-client-1" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.263831 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm9k5\" (UniqueName: \"kubernetes.io/projected/2a8176b4-3902-418e-9cbe-39886135c176-kube-api-access-mm9k5\") pod \"mariadb-client-1\" (UID: \"2a8176b4-3902-418e-9cbe-39886135c176\") " pod="openstack/mariadb-client-1" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.300481 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm9k5\" (UniqueName: \"kubernetes.io/projected/2a8176b4-3902-418e-9cbe-39886135c176-kube-api-access-mm9k5\") pod \"mariadb-client-1\" (UID: \"2a8176b4-3902-418e-9cbe-39886135c176\") " pod="openstack/mariadb-client-1" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.417099 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 08:11:04 crc kubenswrapper[4732]: I1010 08:11:04.756064 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 08:11:04 crc kubenswrapper[4732]: W1010 08:11:04.765070 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8176b4_3902_418e_9cbe_39886135c176.slice/crio-b0a9434f867fb16444d6e605f33c97e6c10016563aed848a690707627087204c WatchSource:0}: Error finding container b0a9434f867fb16444d6e605f33c97e6c10016563aed848a690707627087204c: Status 404 returned error can't find the container with id b0a9434f867fb16444d6e605f33c97e6c10016563aed848a690707627087204c Oct 10 08:11:05 crc kubenswrapper[4732]: I1010 08:11:05.041786 4732 generic.go:334] "Generic (PLEG): container finished" podID="2a8176b4-3902-418e-9cbe-39886135c176" containerID="1d4e2cf8adad45949ecbfcd79d98d5f00081a858bed077f5351890df31b71549" exitCode=0 Oct 10 08:11:05 crc kubenswrapper[4732]: I1010 08:11:05.041949 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2a8176b4-3902-418e-9cbe-39886135c176","Type":"ContainerDied","Data":"1d4e2cf8adad45949ecbfcd79d98d5f00081a858bed077f5351890df31b71549"} Oct 10 08:11:05 crc kubenswrapper[4732]: I1010 08:11:05.041992 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2a8176b4-3902-418e-9cbe-39886135c176","Type":"ContainerStarted","Data":"b0a9434f867fb16444d6e605f33c97e6c10016563aed848a690707627087204c"} Oct 10 08:11:06 crc kubenswrapper[4732]: I1010 08:11:06.502579 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 08:11:06 crc kubenswrapper[4732]: I1010 08:11:06.523308 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_2a8176b4-3902-418e-9cbe-39886135c176/mariadb-client-1/0.log" Oct 10 08:11:06 crc kubenswrapper[4732]: I1010 08:11:06.555357 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 08:11:06 crc kubenswrapper[4732]: I1010 08:11:06.565444 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 10 08:11:06 crc kubenswrapper[4732]: I1010 08:11:06.601063 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm9k5\" (UniqueName: \"kubernetes.io/projected/2a8176b4-3902-418e-9cbe-39886135c176-kube-api-access-mm9k5\") pod \"2a8176b4-3902-418e-9cbe-39886135c176\" (UID: \"2a8176b4-3902-418e-9cbe-39886135c176\") " Oct 10 08:11:06 crc kubenswrapper[4732]: I1010 08:11:06.606338 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8176b4-3902-418e-9cbe-39886135c176-kube-api-access-mm9k5" (OuterVolumeSpecName: "kube-api-access-mm9k5") pod "2a8176b4-3902-418e-9cbe-39886135c176" (UID: "2a8176b4-3902-418e-9cbe-39886135c176"). InnerVolumeSpecName "kube-api-access-mm9k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:06 crc kubenswrapper[4732]: I1010 08:11:06.703148 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm9k5\" (UniqueName: \"kubernetes.io/projected/2a8176b4-3902-418e-9cbe-39886135c176-kube-api-access-mm9k5\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.027579 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 08:11:07 crc kubenswrapper[4732]: E1010 08:11:07.028265 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8176b4-3902-418e-9cbe-39886135c176" containerName="mariadb-client-1" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.028302 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8176b4-3902-418e-9cbe-39886135c176" containerName="mariadb-client-1" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.028643 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8176b4-3902-418e-9cbe-39886135c176" containerName="mariadb-client-1" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.029534 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.038959 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.072533 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0a9434f867fb16444d6e605f33c97e6c10016563aed848a690707627087204c" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.072644 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.112619 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhlds\" (UniqueName: \"kubernetes.io/projected/96a90a56-a2c8-46e4-810a-cc8ef4e98e67-kube-api-access-dhlds\") pod \"mariadb-client-4-default\" (UID: \"96a90a56-a2c8-46e4-810a-cc8ef4e98e67\") " pod="openstack/mariadb-client-4-default" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.214563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhlds\" (UniqueName: \"kubernetes.io/projected/96a90a56-a2c8-46e4-810a-cc8ef4e98e67-kube-api-access-dhlds\") pod \"mariadb-client-4-default\" (UID: \"96a90a56-a2c8-46e4-810a-cc8ef4e98e67\") " pod="openstack/mariadb-client-4-default" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.239254 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhlds\" (UniqueName: \"kubernetes.io/projected/96a90a56-a2c8-46e4-810a-cc8ef4e98e67-kube-api-access-dhlds\") pod \"mariadb-client-4-default\" (UID: \"96a90a56-a2c8-46e4-810a-cc8ef4e98e67\") " pod="openstack/mariadb-client-4-default" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.363582 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.670879 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8176b4-3902-418e-9cbe-39886135c176" path="/var/lib/kubelet/pods/2a8176b4-3902-418e-9cbe-39886135c176/volumes" Oct 10 08:11:07 crc kubenswrapper[4732]: I1010 08:11:07.678172 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 08:11:08 crc kubenswrapper[4732]: I1010 08:11:08.084782 4732 generic.go:334] "Generic (PLEG): container finished" podID="96a90a56-a2c8-46e4-810a-cc8ef4e98e67" containerID="bc67547b63be0e794c508a2d8408e3721454031ef4b0785f86d647a774ba4a32" exitCode=0 Oct 10 08:11:08 crc kubenswrapper[4732]: I1010 08:11:08.084847 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"96a90a56-a2c8-46e4-810a-cc8ef4e98e67","Type":"ContainerDied","Data":"bc67547b63be0e794c508a2d8408e3721454031ef4b0785f86d647a774ba4a32"} Oct 10 08:11:08 crc kubenswrapper[4732]: I1010 08:11:08.084891 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"96a90a56-a2c8-46e4-810a-cc8ef4e98e67","Type":"ContainerStarted","Data":"edcdca1ac8091540b9b7c7492b6439a326847abb59b2bc267bf8a2862cb87855"} Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.518859 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.537621 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_96a90a56-a2c8-46e4-810a-cc8ef4e98e67/mariadb-client-4-default/0.log" Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.555128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhlds\" (UniqueName: \"kubernetes.io/projected/96a90a56-a2c8-46e4-810a-cc8ef4e98e67-kube-api-access-dhlds\") pod \"96a90a56-a2c8-46e4-810a-cc8ef4e98e67\" (UID: \"96a90a56-a2c8-46e4-810a-cc8ef4e98e67\") " Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.566225 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.568207 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a90a56-a2c8-46e4-810a-cc8ef4e98e67-kube-api-access-dhlds" (OuterVolumeSpecName: "kube-api-access-dhlds") pod "96a90a56-a2c8-46e4-810a-cc8ef4e98e67" (UID: "96a90a56-a2c8-46e4-810a-cc8ef4e98e67"). InnerVolumeSpecName "kube-api-access-dhlds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.571856 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.656900 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhlds\" (UniqueName: \"kubernetes.io/projected/96a90a56-a2c8-46e4-810a-cc8ef4e98e67-kube-api-access-dhlds\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:09 crc kubenswrapper[4732]: I1010 08:11:09.668232 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a90a56-a2c8-46e4-810a-cc8ef4e98e67" path="/var/lib/kubelet/pods/96a90a56-a2c8-46e4-810a-cc8ef4e98e67/volumes" Oct 10 08:11:10 crc kubenswrapper[4732]: I1010 08:11:10.110079 4732 scope.go:117] "RemoveContainer" containerID="bc67547b63be0e794c508a2d8408e3721454031ef4b0785f86d647a774ba4a32" Oct 10 08:11:10 crc kubenswrapper[4732]: I1010 08:11:10.110135 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.738767 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 08:11:13 crc kubenswrapper[4732]: E1010 08:11:13.739870 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a90a56-a2c8-46e4-810a-cc8ef4e98e67" containerName="mariadb-client-4-default" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.739912 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a90a56-a2c8-46e4-810a-cc8ef4e98e67" containerName="mariadb-client-4-default" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.740196 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a90a56-a2c8-46e4-810a-cc8ef4e98e67" containerName="mariadb-client-4-default" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.741318 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.743747 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5ddsv" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.746917 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.820390 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2cpj\" (UniqueName: \"kubernetes.io/projected/c0745cfa-2275-4931-8677-ef3647e759a0-kube-api-access-l2cpj\") pod \"mariadb-client-5-default\" (UID: \"c0745cfa-2275-4931-8677-ef3647e759a0\") " pod="openstack/mariadb-client-5-default" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.922423 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2cpj\" (UniqueName: 
\"kubernetes.io/projected/c0745cfa-2275-4931-8677-ef3647e759a0-kube-api-access-l2cpj\") pod \"mariadb-client-5-default\" (UID: \"c0745cfa-2275-4931-8677-ef3647e759a0\") " pod="openstack/mariadb-client-5-default" Oct 10 08:11:13 crc kubenswrapper[4732]: I1010 08:11:13.946919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2cpj\" (UniqueName: \"kubernetes.io/projected/c0745cfa-2275-4931-8677-ef3647e759a0-kube-api-access-l2cpj\") pod \"mariadb-client-5-default\" (UID: \"c0745cfa-2275-4931-8677-ef3647e759a0\") " pod="openstack/mariadb-client-5-default" Oct 10 08:11:14 crc kubenswrapper[4732]: I1010 08:11:14.058249 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 08:11:14 crc kubenswrapper[4732]: I1010 08:11:14.638165 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 08:11:15 crc kubenswrapper[4732]: I1010 08:11:15.165835 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0745cfa-2275-4931-8677-ef3647e759a0" containerID="33037f3859291d7fa36d46805b0d806e2d0a7d5a3b74bed1d4e9f3c6d6d27c41" exitCode=0 Oct 10 08:11:15 crc kubenswrapper[4732]: I1010 08:11:15.165954 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c0745cfa-2275-4931-8677-ef3647e759a0","Type":"ContainerDied","Data":"33037f3859291d7fa36d46805b0d806e2d0a7d5a3b74bed1d4e9f3c6d6d27c41"} Oct 10 08:11:15 crc kubenswrapper[4732]: I1010 08:11:15.166270 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c0745cfa-2275-4931-8677-ef3647e759a0","Type":"ContainerStarted","Data":"6abc7912abb4558c7783c2ec12bd8e9da886ec4d686eeaa1ac3af369d1de0ac8"} Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.555460 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.576991 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_c0745cfa-2275-4931-8677-ef3647e759a0/mariadb-client-5-default/0.log" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.608117 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.617643 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.665771 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2cpj\" (UniqueName: \"kubernetes.io/projected/c0745cfa-2275-4931-8677-ef3647e759a0-kube-api-access-l2cpj\") pod \"c0745cfa-2275-4931-8677-ef3647e759a0\" (UID: \"c0745cfa-2275-4931-8677-ef3647e759a0\") " Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.672099 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0745cfa-2275-4931-8677-ef3647e759a0-kube-api-access-l2cpj" (OuterVolumeSpecName: "kube-api-access-l2cpj") pod "c0745cfa-2275-4931-8677-ef3647e759a0" (UID: "c0745cfa-2275-4931-8677-ef3647e759a0"). InnerVolumeSpecName "kube-api-access-l2cpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.769059 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2cpj\" (UniqueName: \"kubernetes.io/projected/c0745cfa-2275-4931-8677-ef3647e759a0-kube-api-access-l2cpj\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.806682 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 08:11:16 crc kubenswrapper[4732]: E1010 08:11:16.807443 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0745cfa-2275-4931-8677-ef3647e759a0" containerName="mariadb-client-5-default" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.807603 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0745cfa-2275-4931-8677-ef3647e759a0" containerName="mariadb-client-5-default" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.808052 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0745cfa-2275-4931-8677-ef3647e759a0" containerName="mariadb-client-5-default" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.808977 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.820645 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.870341 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpdl2\" (UniqueName: \"kubernetes.io/projected/dc40afb8-8b6a-47a2-ab85-251f4da68b6b-kube-api-access-tpdl2\") pod \"mariadb-client-6-default\" (UID: \"dc40afb8-8b6a-47a2-ab85-251f4da68b6b\") " pod="openstack/mariadb-client-6-default" Oct 10 08:11:16 crc kubenswrapper[4732]: I1010 08:11:16.972340 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpdl2\" (UniqueName: \"kubernetes.io/projected/dc40afb8-8b6a-47a2-ab85-251f4da68b6b-kube-api-access-tpdl2\") pod \"mariadb-client-6-default\" (UID: \"dc40afb8-8b6a-47a2-ab85-251f4da68b6b\") " pod="openstack/mariadb-client-6-default" Oct 10 08:11:17 crc kubenswrapper[4732]: I1010 08:11:17.002029 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpdl2\" (UniqueName: \"kubernetes.io/projected/dc40afb8-8b6a-47a2-ab85-251f4da68b6b-kube-api-access-tpdl2\") pod \"mariadb-client-6-default\" (UID: \"dc40afb8-8b6a-47a2-ab85-251f4da68b6b\") " pod="openstack/mariadb-client-6-default" Oct 10 08:11:17 crc kubenswrapper[4732]: I1010 08:11:17.140875 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 08:11:17 crc kubenswrapper[4732]: I1010 08:11:17.189315 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6abc7912abb4558c7783c2ec12bd8e9da886ec4d686eeaa1ac3af369d1de0ac8" Oct 10 08:11:17 crc kubenswrapper[4732]: I1010 08:11:17.189383 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 10 08:11:17 crc kubenswrapper[4732]: I1010 08:11:17.672877 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0745cfa-2275-4931-8677-ef3647e759a0" path="/var/lib/kubelet/pods/c0745cfa-2275-4931-8677-ef3647e759a0/volumes" Oct 10 08:11:17 crc kubenswrapper[4732]: I1010 08:11:17.743315 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 08:11:18 crc kubenswrapper[4732]: I1010 08:11:18.200767 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"dc40afb8-8b6a-47a2-ab85-251f4da68b6b","Type":"ContainerStarted","Data":"8df65e0cacecef098dd26b10eba67b84a0946664edec8a0d70f2a7dd9037cabc"} Oct 10 08:11:18 crc kubenswrapper[4732]: I1010 08:11:18.201076 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"dc40afb8-8b6a-47a2-ab85-251f4da68b6b","Type":"ContainerStarted","Data":"1701faef1056a86c116ed515582934c146913629975bdb69735beb1772223077"} Oct 10 08:11:18 crc kubenswrapper[4732]: I1010 08:11:18.240266 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.240235685 podStartE2EDuration="2.240235685s" podCreationTimestamp="2025-10-10 08:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:11:18.228819707 +0000 UTC m=+4805.298411018" watchObservedRunningTime="2025-10-10 08:11:18.240235685 +0000 UTC m=+4805.309826956" Oct 10 08:11:19 crc kubenswrapper[4732]: I1010 08:11:19.209370 4732 generic.go:334] "Generic (PLEG): container finished" podID="dc40afb8-8b6a-47a2-ab85-251f4da68b6b" containerID="8df65e0cacecef098dd26b10eba67b84a0946664edec8a0d70f2a7dd9037cabc" exitCode=0 Oct 10 08:11:19 crc kubenswrapper[4732]: I1010 08:11:19.209427 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"dc40afb8-8b6a-47a2-ab85-251f4da68b6b","Type":"ContainerDied","Data":"8df65e0cacecef098dd26b10eba67b84a0946664edec8a0d70f2a7dd9037cabc"} Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.619612 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.667852 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.676322 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.750737 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpdl2\" (UniqueName: \"kubernetes.io/projected/dc40afb8-8b6a-47a2-ab85-251f4da68b6b-kube-api-access-tpdl2\") pod \"dc40afb8-8b6a-47a2-ab85-251f4da68b6b\" (UID: \"dc40afb8-8b6a-47a2-ab85-251f4da68b6b\") " Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.757717 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc40afb8-8b6a-47a2-ab85-251f4da68b6b-kube-api-access-tpdl2" (OuterVolumeSpecName: "kube-api-access-tpdl2") pod "dc40afb8-8b6a-47a2-ab85-251f4da68b6b" (UID: "dc40afb8-8b6a-47a2-ab85-251f4da68b6b"). InnerVolumeSpecName "kube-api-access-tpdl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.854223 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpdl2\" (UniqueName: \"kubernetes.io/projected/dc40afb8-8b6a-47a2-ab85-251f4da68b6b-kube-api-access-tpdl2\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.857811 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 08:11:20 crc kubenswrapper[4732]: E1010 08:11:20.858272 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc40afb8-8b6a-47a2-ab85-251f4da68b6b" containerName="mariadb-client-6-default" Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.858290 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc40afb8-8b6a-47a2-ab85-251f4da68b6b" containerName="mariadb-client-6-default" Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.858488 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc40afb8-8b6a-47a2-ab85-251f4da68b6b" containerName="mariadb-client-6-default" Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.859058 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.873412 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 08:11:20 crc kubenswrapper[4732]: I1010 08:11:20.956184 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdctf\" (UniqueName: \"kubernetes.io/projected/363e9216-b2c6-4e45-9347-617ee01529f4-kube-api-access-tdctf\") pod \"mariadb-client-7-default\" (UID: \"363e9216-b2c6-4e45-9347-617ee01529f4\") " pod="openstack/mariadb-client-7-default" Oct 10 08:11:21 crc kubenswrapper[4732]: I1010 08:11:21.058588 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdctf\" (UniqueName: \"kubernetes.io/projected/363e9216-b2c6-4e45-9347-617ee01529f4-kube-api-access-tdctf\") pod \"mariadb-client-7-default\" (UID: \"363e9216-b2c6-4e45-9347-617ee01529f4\") " pod="openstack/mariadb-client-7-default" Oct 10 08:11:21 crc kubenswrapper[4732]: I1010 08:11:21.085205 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdctf\" (UniqueName: \"kubernetes.io/projected/363e9216-b2c6-4e45-9347-617ee01529f4-kube-api-access-tdctf\") pod \"mariadb-client-7-default\" (UID: \"363e9216-b2c6-4e45-9347-617ee01529f4\") " pod="openstack/mariadb-client-7-default" Oct 10 08:11:21 crc kubenswrapper[4732]: I1010 08:11:21.177054 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 08:11:21 crc kubenswrapper[4732]: I1010 08:11:21.230789 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1701faef1056a86c116ed515582934c146913629975bdb69735beb1772223077" Oct 10 08:11:21 crc kubenswrapper[4732]: I1010 08:11:21.230914 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 10 08:11:21 crc kubenswrapper[4732]: I1010 08:11:21.696660 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc40afb8-8b6a-47a2-ab85-251f4da68b6b" path="/var/lib/kubelet/pods/dc40afb8-8b6a-47a2-ab85-251f4da68b6b/volumes" Oct 10 08:11:21 crc kubenswrapper[4732]: I1010 08:11:21.767821 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 08:11:22 crc kubenswrapper[4732]: I1010 08:11:22.242392 4732 generic.go:334] "Generic (PLEG): container finished" podID="363e9216-b2c6-4e45-9347-617ee01529f4" containerID="2a3b22601a53e91620ec3a3d0fc638a17592e7470d620efd1f62f3cc628814aa" exitCode=0 Oct 10 08:11:22 crc kubenswrapper[4732]: I1010 08:11:22.242477 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"363e9216-b2c6-4e45-9347-617ee01529f4","Type":"ContainerDied","Data":"2a3b22601a53e91620ec3a3d0fc638a17592e7470d620efd1f62f3cc628814aa"} Oct 10 08:11:22 crc kubenswrapper[4732]: I1010 08:11:22.242715 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"363e9216-b2c6-4e45-9347-617ee01529f4","Type":"ContainerStarted","Data":"a4a4f66dd843c1cf8938fe3ce09cf8c4678177b0e59094c8edfd2770c27169cd"} Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.770764 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.789738 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_363e9216-b2c6-4e45-9347-617ee01529f4/mariadb-client-7-default/0.log" Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.815990 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.824650 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.922634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdctf\" (UniqueName: \"kubernetes.io/projected/363e9216-b2c6-4e45-9347-617ee01529f4-kube-api-access-tdctf\") pod \"363e9216-b2c6-4e45-9347-617ee01529f4\" (UID: \"363e9216-b2c6-4e45-9347-617ee01529f4\") " Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.929514 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/363e9216-b2c6-4e45-9347-617ee01529f4-kube-api-access-tdctf" (OuterVolumeSpecName: "kube-api-access-tdctf") pod "363e9216-b2c6-4e45-9347-617ee01529f4" (UID: "363e9216-b2c6-4e45-9347-617ee01529f4"). InnerVolumeSpecName "kube-api-access-tdctf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.997137 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 10 08:11:23 crc kubenswrapper[4732]: E1010 08:11:23.997774 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363e9216-b2c6-4e45-9347-617ee01529f4" containerName="mariadb-client-7-default" Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.997814 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="363e9216-b2c6-4e45-9347-617ee01529f4" containerName="mariadb-client-7-default" Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.998202 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="363e9216-b2c6-4e45-9347-617ee01529f4" containerName="mariadb-client-7-default" Oct 10 08:11:23 crc kubenswrapper[4732]: I1010 08:11:23.999095 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.005262 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.024096 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdctf\" (UniqueName: \"kubernetes.io/projected/363e9216-b2c6-4e45-9347-617ee01529f4-kube-api-access-tdctf\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.126916 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbvlj\" (UniqueName: \"kubernetes.io/projected/c42e3650-3594-404e-bd6c-adcc2d2cc0bf-kube-api-access-pbvlj\") pod \"mariadb-client-2\" (UID: \"c42e3650-3594-404e-bd6c-adcc2d2cc0bf\") " pod="openstack/mariadb-client-2" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.228355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pbvlj\" (UniqueName: \"kubernetes.io/projected/c42e3650-3594-404e-bd6c-adcc2d2cc0bf-kube-api-access-pbvlj\") pod \"mariadb-client-2\" (UID: \"c42e3650-3594-404e-bd6c-adcc2d2cc0bf\") " pod="openstack/mariadb-client-2" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.252487 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbvlj\" (UniqueName: \"kubernetes.io/projected/c42e3650-3594-404e-bd6c-adcc2d2cc0bf-kube-api-access-pbvlj\") pod \"mariadb-client-2\" (UID: \"c42e3650-3594-404e-bd6c-adcc2d2cc0bf\") " pod="openstack/mariadb-client-2" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.277864 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4a4f66dd843c1cf8938fe3ce09cf8c4678177b0e59094c8edfd2770c27169cd" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.277968 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.323934 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 08:11:24 crc kubenswrapper[4732]: I1010 08:11:24.884904 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 08:11:24 crc kubenswrapper[4732]: W1010 08:11:24.898608 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc42e3650_3594_404e_bd6c_adcc2d2cc0bf.slice/crio-1c5246bbbbc5a9c5d4f39929105ffce605b5d486d3cc70187ca1059b6617b18a WatchSource:0}: Error finding container 1c5246bbbbc5a9c5d4f39929105ffce605b5d486d3cc70187ca1059b6617b18a: Status 404 returned error can't find the container with id 1c5246bbbbc5a9c5d4f39929105ffce605b5d486d3cc70187ca1059b6617b18a Oct 10 08:11:25 crc kubenswrapper[4732]: I1010 08:11:25.295261 4732 generic.go:334] "Generic (PLEG): container finished" podID="c42e3650-3594-404e-bd6c-adcc2d2cc0bf" containerID="be0d3e89eaf1e00cf40e929f2f77727a5d41dea8f54d52917e4a2026af0efef2" exitCode=0 Oct 10 08:11:25 crc kubenswrapper[4732]: I1010 08:11:25.295509 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"c42e3650-3594-404e-bd6c-adcc2d2cc0bf","Type":"ContainerDied","Data":"be0d3e89eaf1e00cf40e929f2f77727a5d41dea8f54d52917e4a2026af0efef2"} Oct 10 08:11:25 crc kubenswrapper[4732]: I1010 08:11:25.295737 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"c42e3650-3594-404e-bd6c-adcc2d2cc0bf","Type":"ContainerStarted","Data":"1c5246bbbbc5a9c5d4f39929105ffce605b5d486d3cc70187ca1059b6617b18a"} Oct 10 08:11:25 crc kubenswrapper[4732]: I1010 08:11:25.676465 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="363e9216-b2c6-4e45-9347-617ee01529f4" path="/var/lib/kubelet/pods/363e9216-b2c6-4e45-9347-617ee01529f4/volumes" Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.014913 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.036987 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_c42e3650-3594-404e-bd6c-adcc2d2cc0bf/mariadb-client-2/0.log" Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.069799 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.078664 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.172962 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbvlj\" (UniqueName: \"kubernetes.io/projected/c42e3650-3594-404e-bd6c-adcc2d2cc0bf-kube-api-access-pbvlj\") pod \"c42e3650-3594-404e-bd6c-adcc2d2cc0bf\" (UID: \"c42e3650-3594-404e-bd6c-adcc2d2cc0bf\") " Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.181341 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c42e3650-3594-404e-bd6c-adcc2d2cc0bf-kube-api-access-pbvlj" (OuterVolumeSpecName: "kube-api-access-pbvlj") pod "c42e3650-3594-404e-bd6c-adcc2d2cc0bf" (UID: "c42e3650-3594-404e-bd6c-adcc2d2cc0bf"). InnerVolumeSpecName "kube-api-access-pbvlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.275438 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbvlj\" (UniqueName: \"kubernetes.io/projected/c42e3650-3594-404e-bd6c-adcc2d2cc0bf-kube-api-access-pbvlj\") on node \"crc\" DevicePath \"\"" Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.316463 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c5246bbbbc5a9c5d4f39929105ffce605b5d486d3cc70187ca1059b6617b18a" Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.316730 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 10 08:11:27 crc kubenswrapper[4732]: I1010 08:11:27.674442 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c42e3650-3594-404e-bd6c-adcc2d2cc0bf" path="/var/lib/kubelet/pods/c42e3650-3594-404e-bd6c-adcc2d2cc0bf/volumes" Oct 10 08:12:24 crc kubenswrapper[4732]: I1010 08:12:24.988456 4732 scope.go:117] "RemoveContainer" containerID="781dd58a85ce736f53bc98a1bceb9500f57c71f032f9178b9dd187cf9d626c91" Oct 10 08:12:25 crc kubenswrapper[4732]: I1010 08:12:25.355937 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:12:25 crc kubenswrapper[4732]: I1010 08:12:25.355997 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:12:40 crc kubenswrapper[4732]: I1010 08:12:40.969408 4732 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-skk52"] Oct 10 08:12:40 crc kubenswrapper[4732]: E1010 08:12:40.970367 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c42e3650-3594-404e-bd6c-adcc2d2cc0bf" containerName="mariadb-client-2" Oct 10 08:12:40 crc kubenswrapper[4732]: I1010 08:12:40.970385 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c42e3650-3594-404e-bd6c-adcc2d2cc0bf" containerName="mariadb-client-2" Oct 10 08:12:40 crc kubenswrapper[4732]: I1010 08:12:40.970574 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c42e3650-3594-404e-bd6c-adcc2d2cc0bf" containerName="mariadb-client-2" Oct 10 08:12:40 crc kubenswrapper[4732]: I1010 08:12:40.971786 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:40 crc kubenswrapper[4732]: I1010 08:12:40.997992 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-skk52"] Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.166029 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqhl\" (UniqueName: \"kubernetes.io/projected/5515e2cd-900d-43df-a0a8-df91f3e8d70f-kube-api-access-zkqhl\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.166663 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-utilities\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.166736 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-catalog-content\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.268146 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqhl\" (UniqueName: \"kubernetes.io/projected/5515e2cd-900d-43df-a0a8-df91f3e8d70f-kube-api-access-zkqhl\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.268244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-utilities\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.268290 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-catalog-content\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.268685 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-utilities\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.268887 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-catalog-content\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.287848 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqhl\" (UniqueName: \"kubernetes.io/projected/5515e2cd-900d-43df-a0a8-df91f3e8d70f-kube-api-access-zkqhl\") pod \"certified-operators-skk52\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.310098 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:41 crc kubenswrapper[4732]: I1010 08:12:41.802342 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-skk52"] Oct 10 08:12:42 crc kubenswrapper[4732]: I1010 08:12:42.088751 4732 generic.go:334] "Generic (PLEG): container finished" podID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerID="193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976" exitCode=0 Oct 10 08:12:42 crc kubenswrapper[4732]: I1010 08:12:42.088805 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skk52" event={"ID":"5515e2cd-900d-43df-a0a8-df91f3e8d70f","Type":"ContainerDied","Data":"193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976"} Oct 10 08:12:42 crc kubenswrapper[4732]: I1010 08:12:42.088856 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skk52" event={"ID":"5515e2cd-900d-43df-a0a8-df91f3e8d70f","Type":"ContainerStarted","Data":"44cfe77bcf3d81bd1d50e91d6681d7a9f04633ca3037c3f945674b7ee302e9b8"} Oct 10 08:12:43 crc 
kubenswrapper[4732]: I1010 08:12:43.103422 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skk52" event={"ID":"5515e2cd-900d-43df-a0a8-df91f3e8d70f","Type":"ContainerStarted","Data":"13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f"} Oct 10 08:12:44 crc kubenswrapper[4732]: I1010 08:12:44.115656 4732 generic.go:334] "Generic (PLEG): container finished" podID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerID="13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f" exitCode=0 Oct 10 08:12:44 crc kubenswrapper[4732]: I1010 08:12:44.115758 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skk52" event={"ID":"5515e2cd-900d-43df-a0a8-df91f3e8d70f","Type":"ContainerDied","Data":"13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f"} Oct 10 08:12:45 crc kubenswrapper[4732]: I1010 08:12:45.128894 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skk52" event={"ID":"5515e2cd-900d-43df-a0a8-df91f3e8d70f","Type":"ContainerStarted","Data":"91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529"} Oct 10 08:12:45 crc kubenswrapper[4732]: I1010 08:12:45.162249 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-skk52" podStartSLOduration=2.684284793 podStartE2EDuration="5.162224141s" podCreationTimestamp="2025-10-10 08:12:40 +0000 UTC" firstStartedPulling="2025-10-10 08:12:42.091388379 +0000 UTC m=+4889.160979620" lastFinishedPulling="2025-10-10 08:12:44.569327727 +0000 UTC m=+4891.638918968" observedRunningTime="2025-10-10 08:12:45.155551731 +0000 UTC m=+4892.225142982" watchObservedRunningTime="2025-10-10 08:12:45.162224141 +0000 UTC m=+4892.231815412" Oct 10 08:12:51 crc kubenswrapper[4732]: I1010 08:12:51.310882 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:51 crc kubenswrapper[4732]: I1010 08:12:51.311545 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:51 crc kubenswrapper[4732]: I1010 08:12:51.365392 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:52 crc kubenswrapper[4732]: I1010 08:12:52.236442 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:52 crc kubenswrapper[4732]: I1010 08:12:52.282791 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-skk52"] Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.207076 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-skk52" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="registry-server" containerID="cri-o://91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529" gracePeriod=2 Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.631904 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.817499 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkqhl\" (UniqueName: \"kubernetes.io/projected/5515e2cd-900d-43df-a0a8-df91f3e8d70f-kube-api-access-zkqhl\") pod \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.817578 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-catalog-content\") pod \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.817675 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-utilities\") pod \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\" (UID: \"5515e2cd-900d-43df-a0a8-df91f3e8d70f\") " Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.819458 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-utilities" (OuterVolumeSpecName: "utilities") pod "5515e2cd-900d-43df-a0a8-df91f3e8d70f" (UID: "5515e2cd-900d-43df-a0a8-df91f3e8d70f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.826457 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5515e2cd-900d-43df-a0a8-df91f3e8d70f-kube-api-access-zkqhl" (OuterVolumeSpecName: "kube-api-access-zkqhl") pod "5515e2cd-900d-43df-a0a8-df91f3e8d70f" (UID: "5515e2cd-900d-43df-a0a8-df91f3e8d70f"). InnerVolumeSpecName "kube-api-access-zkqhl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.896351 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5515e2cd-900d-43df-a0a8-df91f3e8d70f" (UID: "5515e2cd-900d-43df-a0a8-df91f3e8d70f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.919685 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkqhl\" (UniqueName: \"kubernetes.io/projected/5515e2cd-900d-43df-a0a8-df91f3e8d70f-kube-api-access-zkqhl\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.919919 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:54 crc kubenswrapper[4732]: I1010 08:12:54.919949 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5515e2cd-900d-43df-a0a8-df91f3e8d70f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.220445 4732 generic.go:334] "Generic (PLEG): container finished" podID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerID="91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529" exitCode=0 Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.220500 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skk52" event={"ID":"5515e2cd-900d-43df-a0a8-df91f3e8d70f","Type":"ContainerDied","Data":"91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529"} Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.220543 4732 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-skk52" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.220697 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skk52" event={"ID":"5515e2cd-900d-43df-a0a8-df91f3e8d70f","Type":"ContainerDied","Data":"44cfe77bcf3d81bd1d50e91d6681d7a9f04633ca3037c3f945674b7ee302e9b8"} Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.220729 4732 scope.go:117] "RemoveContainer" containerID="91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.248477 4732 scope.go:117] "RemoveContainer" containerID="13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.287682 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-skk52"] Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.292724 4732 scope.go:117] "RemoveContainer" containerID="193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.302200 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-skk52"] Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.314645 4732 scope.go:117] "RemoveContainer" containerID="91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529" Oct 10 08:12:55 crc kubenswrapper[4732]: E1010 08:12:55.315216 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529\": container with ID starting with 91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529 not found: ID does not exist" containerID="91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.315283 
4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529"} err="failed to get container status \"91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529\": rpc error: code = NotFound desc = could not find container \"91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529\": container with ID starting with 91db1129f44bf814bdc54a5349f1feb339b197dff6a424b11bc8d5d5281a7529 not found: ID does not exist" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.315315 4732 scope.go:117] "RemoveContainer" containerID="13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f" Oct 10 08:12:55 crc kubenswrapper[4732]: E1010 08:12:55.315745 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f\": container with ID starting with 13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f not found: ID does not exist" containerID="13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.315789 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f"} err="failed to get container status \"13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f\": rpc error: code = NotFound desc = could not find container \"13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f\": container with ID starting with 13eadfe849ae8038aa9d0c5d18662e39ba1c45299a75e3d970adb469151a1b4f not found: ID does not exist" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.315815 4732 scope.go:117] "RemoveContainer" containerID="193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976" Oct 10 08:12:55 crc kubenswrapper[4732]: E1010 
08:12:55.316258 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976\": container with ID starting with 193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976 not found: ID does not exist" containerID="193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.316409 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976"} err="failed to get container status \"193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976\": rpc error: code = NotFound desc = could not find container \"193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976\": container with ID starting with 193c1fa72a2cb027c38b410f4beb4add5ccee49f4391e52148d3208e302a9976 not found: ID does not exist" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.355981 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.356039 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:12:55 crc kubenswrapper[4732]: I1010 08:12:55.669609 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" 
path="/var/lib/kubelet/pods/5515e2cd-900d-43df-a0a8-df91f3e8d70f/volumes" Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.356544 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.356994 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.357032 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.357585 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.357632 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" gracePeriod=600 Oct 10 08:13:25 crc kubenswrapper[4732]: E1010 08:13:25.482886 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.524496 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" exitCode=0 Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.524533 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"} Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.524579 4732 scope.go:117] "RemoveContainer" containerID="414b67626caa2a3a0941b29c620d3e844d3d0e2537b4a836117644eb6baed081" Oct 10 08:13:25 crc kubenswrapper[4732]: I1010 08:13:25.525628 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:13:25 crc kubenswrapper[4732]: E1010 08:13:25.525959 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:13:37 crc kubenswrapper[4732]: E1010 08:13:37.928004 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Oct 10 08:13:38 crc kubenswrapper[4732]: I1010 08:13:38.660582 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:13:38 crc kubenswrapper[4732]: E1010 08:13:38.660912 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.458818 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xhwvv"] Oct 10 08:13:42 crc kubenswrapper[4732]: E1010 08:13:42.459609 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="extract-utilities" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.459625 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="extract-utilities" Oct 10 08:13:42 crc kubenswrapper[4732]: E1010 08:13:42.459644 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="extract-content" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.459651 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="extract-content" Oct 10 08:13:42 crc kubenswrapper[4732]: E1010 08:13:42.459672 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="registry-server" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.459680 
4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="registry-server" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.460118 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5515e2cd-900d-43df-a0a8-df91f3e8d70f" containerName="registry-server" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.461647 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.474286 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xhwvv"] Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.558237 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-catalog-content\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.558309 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-utilities\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.558682 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlqz9\" (UniqueName: \"kubernetes.io/projected/c79c5074-7f22-42e9-a766-bb5f0b2451c0-kube-api-access-vlqz9\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 
08:13:42.660714 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-catalog-content\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.660773 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-utilities\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.661509 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-utilities\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.661519 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-catalog-content\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.661349 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlqz9\" (UniqueName: \"kubernetes.io/projected/c79c5074-7f22-42e9-a766-bb5f0b2451c0-kube-api-access-vlqz9\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.686674 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlqz9\" (UniqueName: \"kubernetes.io/projected/c79c5074-7f22-42e9-a766-bb5f0b2451c0-kube-api-access-vlqz9\") pod \"community-operators-xhwvv\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:42 crc kubenswrapper[4732]: I1010 08:13:42.794288 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:43 crc kubenswrapper[4732]: I1010 08:13:43.529057 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xhwvv"] Oct 10 08:13:43 crc kubenswrapper[4732]: W1010 08:13:43.538287 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79c5074_7f22_42e9_a766_bb5f0b2451c0.slice/crio-62d73ded657dd2eac44f2e550f0e49eeae23629dd0eeef347dfcf44bf8e1b8ae WatchSource:0}: Error finding container 62d73ded657dd2eac44f2e550f0e49eeae23629dd0eeef347dfcf44bf8e1b8ae: Status 404 returned error can't find the container with id 62d73ded657dd2eac44f2e550f0e49eeae23629dd0eeef347dfcf44bf8e1b8ae Oct 10 08:13:43 crc kubenswrapper[4732]: I1010 08:13:43.710100 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhwvv" event={"ID":"c79c5074-7f22-42e9-a766-bb5f0b2451c0","Type":"ContainerStarted","Data":"62d73ded657dd2eac44f2e550f0e49eeae23629dd0eeef347dfcf44bf8e1b8ae"} Oct 10 08:13:44 crc kubenswrapper[4732]: I1010 08:13:44.724978 4732 generic.go:334] "Generic (PLEG): container finished" podID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerID="4e77e523ea519001cb9b7fac2d5e128f39ee1f7fcd373c6018df6abf29d9821c" exitCode=0 Oct 10 08:13:44 crc kubenswrapper[4732]: I1010 08:13:44.725057 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhwvv" 
event={"ID":"c79c5074-7f22-42e9-a766-bb5f0b2451c0","Type":"ContainerDied","Data":"4e77e523ea519001cb9b7fac2d5e128f39ee1f7fcd373c6018df6abf29d9821c"} Oct 10 08:13:44 crc kubenswrapper[4732]: I1010 08:13:44.728582 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:13:45 crc kubenswrapper[4732]: I1010 08:13:45.735774 4732 generic.go:334] "Generic (PLEG): container finished" podID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerID="bb0bebf7c7e500a642da1b279f14f0e83c07dc42c479c5680e2d4d35c9b92026" exitCode=0 Oct 10 08:13:45 crc kubenswrapper[4732]: I1010 08:13:45.735879 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhwvv" event={"ID":"c79c5074-7f22-42e9-a766-bb5f0b2451c0","Type":"ContainerDied","Data":"bb0bebf7c7e500a642da1b279f14f0e83c07dc42c479c5680e2d4d35c9b92026"} Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.031976 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-98qfb"] Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.035268 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.047195 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98qfb"] Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.115799 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59jnt\" (UniqueName: \"kubernetes.io/projected/35299aeb-f133-4fef-bb21-0b3786d8ea08-kube-api-access-59jnt\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.115901 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-utilities\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.116057 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-catalog-content\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.217433 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59jnt\" (UniqueName: \"kubernetes.io/projected/35299aeb-f133-4fef-bb21-0b3786d8ea08-kube-api-access-59jnt\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.217814 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-utilities\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.217859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-catalog-content\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.218261 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-utilities\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.218345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-catalog-content\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.237463 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59jnt\" (UniqueName: \"kubernetes.io/projected/35299aeb-f133-4fef-bb21-0b3786d8ea08-kube-api-access-59jnt\") pod \"redhat-operators-98qfb\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.362137 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.600411 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98qfb"] Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.743210 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98qfb" event={"ID":"35299aeb-f133-4fef-bb21-0b3786d8ea08","Type":"ContainerStarted","Data":"fa2af40bd9d3c0fdfc7ac87bf9abadecb44e04b21fe3898cee49dd81d279976b"} Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.745798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhwvv" event={"ID":"c79c5074-7f22-42e9-a766-bb5f0b2451c0","Type":"ContainerStarted","Data":"307e51bedfc6323b4b22d27d0ca5a9b35b5fb59e454d2e92e1b76a992141177b"} Oct 10 08:13:46 crc kubenswrapper[4732]: I1010 08:13:46.766848 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xhwvv" podStartSLOduration=3.322931543 podStartE2EDuration="4.766832315s" podCreationTimestamp="2025-10-10 08:13:42 +0000 UTC" firstStartedPulling="2025-10-10 08:13:44.728059065 +0000 UTC m=+4951.797650336" lastFinishedPulling="2025-10-10 08:13:46.171959867 +0000 UTC m=+4953.241551108" observedRunningTime="2025-10-10 08:13:46.762302562 +0000 UTC m=+4953.831893823" watchObservedRunningTime="2025-10-10 08:13:46.766832315 +0000 UTC m=+4953.836423556" Oct 10 08:13:47 crc kubenswrapper[4732]: I1010 08:13:47.754500 4732 generic.go:334] "Generic (PLEG): container finished" podID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerID="12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de" exitCode=0 Oct 10 08:13:47 crc kubenswrapper[4732]: I1010 08:13:47.754557 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98qfb" 
event={"ID":"35299aeb-f133-4fef-bb21-0b3786d8ea08","Type":"ContainerDied","Data":"12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de"} Oct 10 08:13:48 crc kubenswrapper[4732]: I1010 08:13:48.768321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98qfb" event={"ID":"35299aeb-f133-4fef-bb21-0b3786d8ea08","Type":"ContainerStarted","Data":"e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98"} Oct 10 08:13:49 crc kubenswrapper[4732]: I1010 08:13:49.779933 4732 generic.go:334] "Generic (PLEG): container finished" podID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerID="e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98" exitCode=0 Oct 10 08:13:49 crc kubenswrapper[4732]: I1010 08:13:49.780055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98qfb" event={"ID":"35299aeb-f133-4fef-bb21-0b3786d8ea08","Type":"ContainerDied","Data":"e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98"} Oct 10 08:13:50 crc kubenswrapper[4732]: I1010 08:13:50.793598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98qfb" event={"ID":"35299aeb-f133-4fef-bb21-0b3786d8ea08","Type":"ContainerStarted","Data":"1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7"} Oct 10 08:13:52 crc kubenswrapper[4732]: I1010 08:13:52.660666 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:13:52 crc kubenswrapper[4732]: E1010 08:13:52.661277 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:13:52 crc kubenswrapper[4732]: I1010 08:13:52.804722 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:52 crc kubenswrapper[4732]: I1010 08:13:52.804814 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:52 crc kubenswrapper[4732]: I1010 08:13:52.862802 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:52 crc kubenswrapper[4732]: I1010 08:13:52.883536 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-98qfb" podStartSLOduration=4.459383337 podStartE2EDuration="6.883519412s" podCreationTimestamp="2025-10-10 08:13:46 +0000 UTC" firstStartedPulling="2025-10-10 08:13:47.757003446 +0000 UTC m=+4954.826594687" lastFinishedPulling="2025-10-10 08:13:50.181139511 +0000 UTC m=+4957.250730762" observedRunningTime="2025-10-10 08:13:50.822299628 +0000 UTC m=+4957.891890929" watchObservedRunningTime="2025-10-10 08:13:52.883519412 +0000 UTC m=+4959.953110663" Oct 10 08:13:53 crc kubenswrapper[4732]: I1010 08:13:53.897833 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:55 crc kubenswrapper[4732]: I1010 08:13:55.021412 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xhwvv"] Oct 10 08:13:55 crc kubenswrapper[4732]: I1010 08:13:55.841127 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xhwvv" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="registry-server" containerID="cri-o://307e51bedfc6323b4b22d27d0ca5a9b35b5fb59e454d2e92e1b76a992141177b" gracePeriod=2 Oct 10 08:13:56 
crc kubenswrapper[4732]: I1010 08:13:56.362564 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.362634 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.432795 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.856225 4732 generic.go:334] "Generic (PLEG): container finished" podID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerID="307e51bedfc6323b4b22d27d0ca5a9b35b5fb59e454d2e92e1b76a992141177b" exitCode=0 Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.856325 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhwvv" event={"ID":"c79c5074-7f22-42e9-a766-bb5f0b2451c0","Type":"ContainerDied","Data":"307e51bedfc6323b4b22d27d0ca5a9b35b5fb59e454d2e92e1b76a992141177b"} Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.856721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xhwvv" event={"ID":"c79c5074-7f22-42e9-a766-bb5f0b2451c0","Type":"ContainerDied","Data":"62d73ded657dd2eac44f2e550f0e49eeae23629dd0eeef347dfcf44bf8e1b8ae"} Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.856736 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62d73ded657dd2eac44f2e550f0e49eeae23629dd0eeef347dfcf44bf8e1b8ae" Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.894899 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.920015 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.986314 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-catalog-content\") pod \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.986646 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-utilities\") pod \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.986870 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlqz9\" (UniqueName: \"kubernetes.io/projected/c79c5074-7f22-42e9-a766-bb5f0b2451c0-kube-api-access-vlqz9\") pod \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\" (UID: \"c79c5074-7f22-42e9-a766-bb5f0b2451c0\") " Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.988275 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-utilities" (OuterVolumeSpecName: "utilities") pod "c79c5074-7f22-42e9-a766-bb5f0b2451c0" (UID: "c79c5074-7f22-42e9-a766-bb5f0b2451c0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:13:56 crc kubenswrapper[4732]: I1010 08:13:56.995930 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79c5074-7f22-42e9-a766-bb5f0b2451c0-kube-api-access-vlqz9" (OuterVolumeSpecName: "kube-api-access-vlqz9") pod "c79c5074-7f22-42e9-a766-bb5f0b2451c0" (UID: "c79c5074-7f22-42e9-a766-bb5f0b2451c0"). InnerVolumeSpecName "kube-api-access-vlqz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 08:13:57.037847 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c79c5074-7f22-42e9-a766-bb5f0b2451c0" (UID: "c79c5074-7f22-42e9-a766-bb5f0b2451c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 08:13:57.089897 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 08:13:57.089933 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c79c5074-7f22-42e9-a766-bb5f0b2451c0-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 08:13:57.089944 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlqz9\" (UniqueName: \"kubernetes.io/projected/c79c5074-7f22-42e9-a766-bb5f0b2451c0-kube-api-access-vlqz9\") on node \"crc\" DevicePath \"\"" Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 08:13:57.817769 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98qfb"] Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 
08:13:57.866924 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xhwvv" Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 08:13:57.895012 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xhwvv"] Oct 10 08:13:57 crc kubenswrapper[4732]: I1010 08:13:57.904462 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xhwvv"] Oct 10 08:13:58 crc kubenswrapper[4732]: I1010 08:13:58.876976 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-98qfb" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="registry-server" containerID="cri-o://1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7" gracePeriod=2 Oct 10 08:13:59 crc kubenswrapper[4732]: I1010 08:13:59.676403 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" path="/var/lib/kubelet/pods/c79c5074-7f22-42e9-a766-bb5f0b2451c0/volumes" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.511910 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.648414 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-catalog-content\") pod \"35299aeb-f133-4fef-bb21-0b3786d8ea08\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.648536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59jnt\" (UniqueName: \"kubernetes.io/projected/35299aeb-f133-4fef-bb21-0b3786d8ea08-kube-api-access-59jnt\") pod \"35299aeb-f133-4fef-bb21-0b3786d8ea08\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.648727 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-utilities\") pod \"35299aeb-f133-4fef-bb21-0b3786d8ea08\" (UID: \"35299aeb-f133-4fef-bb21-0b3786d8ea08\") " Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.650603 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-utilities" (OuterVolumeSpecName: "utilities") pod "35299aeb-f133-4fef-bb21-0b3786d8ea08" (UID: "35299aeb-f133-4fef-bb21-0b3786d8ea08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.656608 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35299aeb-f133-4fef-bb21-0b3786d8ea08-kube-api-access-59jnt" (OuterVolumeSpecName: "kube-api-access-59jnt") pod "35299aeb-f133-4fef-bb21-0b3786d8ea08" (UID: "35299aeb-f133-4fef-bb21-0b3786d8ea08"). InnerVolumeSpecName "kube-api-access-59jnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.726269 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35299aeb-f133-4fef-bb21-0b3786d8ea08" (UID: "35299aeb-f133-4fef-bb21-0b3786d8ea08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.751021 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.751057 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35299aeb-f133-4fef-bb21-0b3786d8ea08-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.751069 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59jnt\" (UniqueName: \"kubernetes.io/projected/35299aeb-f133-4fef-bb21-0b3786d8ea08-kube-api-access-59jnt\") on node \"crc\" DevicePath \"\"" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.897289 4732 generic.go:334] "Generic (PLEG): container finished" podID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerID="1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7" exitCode=0 Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.897384 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98qfb" event={"ID":"35299aeb-f133-4fef-bb21-0b3786d8ea08","Type":"ContainerDied","Data":"1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7"} Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.897456 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-98qfb" event={"ID":"35299aeb-f133-4fef-bb21-0b3786d8ea08","Type":"ContainerDied","Data":"fa2af40bd9d3c0fdfc7ac87bf9abadecb44e04b21fe3898cee49dd81d279976b"} Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.897506 4732 scope.go:117] "RemoveContainer" containerID="1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.897616 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98qfb" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.931339 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98qfb"] Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.938115 4732 scope.go:117] "RemoveContainer" containerID="e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98" Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.939649 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-98qfb"] Oct 10 08:14:00 crc kubenswrapper[4732]: I1010 08:14:00.961394 4732 scope.go:117] "RemoveContainer" containerID="12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de" Oct 10 08:14:01 crc kubenswrapper[4732]: I1010 08:14:01.004276 4732 scope.go:117] "RemoveContainer" containerID="1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7" Oct 10 08:14:01 crc kubenswrapper[4732]: E1010 08:14:01.004916 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7\": container with ID starting with 1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7 not found: ID does not exist" containerID="1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7" Oct 10 08:14:01 crc kubenswrapper[4732]: I1010 08:14:01.004982 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7"} err="failed to get container status \"1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7\": rpc error: code = NotFound desc = could not find container \"1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7\": container with ID starting with 1344e415d3f5c8fe7de4bd64dd3152a490c962493d0c5cda72d812ddb5290ae7 not found: ID does not exist" Oct 10 08:14:01 crc kubenswrapper[4732]: I1010 08:14:01.005015 4732 scope.go:117] "RemoveContainer" containerID="e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98" Oct 10 08:14:01 crc kubenswrapper[4732]: E1010 08:14:01.005788 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98\": container with ID starting with e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98 not found: ID does not exist" containerID="e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98" Oct 10 08:14:01 crc kubenswrapper[4732]: I1010 08:14:01.005831 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98"} err="failed to get container status \"e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98\": rpc error: code = NotFound desc = could not find container \"e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98\": container with ID starting with e38d3355f5046fd692f903c107056807a738fdc192188710fdecb264d69e1d98 not found: ID does not exist" Oct 10 08:14:01 crc kubenswrapper[4732]: I1010 08:14:01.005851 4732 scope.go:117] "RemoveContainer" containerID="12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de" Oct 10 08:14:01 crc kubenswrapper[4732]: E1010 
08:14:01.006192 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de\": container with ID starting with 12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de not found: ID does not exist" containerID="12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de" Oct 10 08:14:01 crc kubenswrapper[4732]: I1010 08:14:01.006226 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de"} err="failed to get container status \"12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de\": rpc error: code = NotFound desc = could not find container \"12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de\": container with ID starting with 12c175c4554bdc091ec5bd3eee700ab58c2ed04fc801080df7907a8f543899de not found: ID does not exist" Oct 10 08:14:01 crc kubenswrapper[4732]: I1010 08:14:01.677935 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" path="/var/lib/kubelet/pods/35299aeb-f133-4fef-bb21-0b3786d8ea08/volumes" Oct 10 08:14:05 crc kubenswrapper[4732]: I1010 08:14:05.662308 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:14:05 crc kubenswrapper[4732]: E1010 08:14:05.663477 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.568648 
4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 08:14:16 crc kubenswrapper[4732]: E1010 08:14:16.569992 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="extract-content" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570035 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="extract-content" Oct 10 08:14:16 crc kubenswrapper[4732]: E1010 08:14:16.570059 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="extract-utilities" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570075 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="extract-utilities" Oct 10 08:14:16 crc kubenswrapper[4732]: E1010 08:14:16.570102 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="extract-content" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570117 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="extract-content" Oct 10 08:14:16 crc kubenswrapper[4732]: E1010 08:14:16.570131 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="registry-server" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570143 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="registry-server" Oct 10 08:14:16 crc kubenswrapper[4732]: E1010 08:14:16.570160 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="extract-utilities" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570172 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="extract-utilities" Oct 10 08:14:16 crc kubenswrapper[4732]: E1010 08:14:16.570203 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="registry-server" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570215 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="registry-server" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570469 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="35299aeb-f133-4fef-bb21-0b3786d8ea08" containerName="registry-server" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.570494 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79c5074-7f22-42e9-a766-bb5f0b2451c0" containerName="registry-server" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.571444 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.574950 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5ddsv" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.586004 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.722375 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\") pod \"mariadb-copy-data\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.722503 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5jp\" 
(UniqueName: \"kubernetes.io/projected/ff596335-f475-49d2-8479-da5e2797ac5e-kube-api-access-zc5jp\") pod \"mariadb-copy-data\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.824860 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5jp\" (UniqueName: \"kubernetes.io/projected/ff596335-f475-49d2-8479-da5e2797ac5e-kube-api-access-zc5jp\") pod \"mariadb-copy-data\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.825492 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\") pod \"mariadb-copy-data\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.830278 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.830346 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\") pod \"mariadb-copy-data\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1676e14c2ae9a489243b3a190efa99ea392eb82cdf191c56146e1cfe237cc443/globalmount\"" pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.863124 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5jp\" (UniqueName: \"kubernetes.io/projected/ff596335-f475-49d2-8479-da5e2797ac5e-kube-api-access-zc5jp\") pod \"mariadb-copy-data\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.869161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\") pod \"mariadb-copy-data\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " pod="openstack/mariadb-copy-data" Oct 10 08:14:16 crc kubenswrapper[4732]: I1010 08:14:16.905423 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 10 08:14:17 crc kubenswrapper[4732]: I1010 08:14:17.253665 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 08:14:18 crc kubenswrapper[4732]: I1010 08:14:18.080440 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ff596335-f475-49d2-8479-da5e2797ac5e","Type":"ContainerStarted","Data":"250a4aa792ae89c51fcd8a699da9f73825467dd962d0f33aa86cc54f8513ecb4"} Oct 10 08:14:18 crc kubenswrapper[4732]: I1010 08:14:18.080521 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ff596335-f475-49d2-8479-da5e2797ac5e","Type":"ContainerStarted","Data":"110fabd8a502e1f632bdfb473bc05a6fdf6c4ed69decbb61dcc231f1dcd7a460"} Oct 10 08:14:18 crc kubenswrapper[4732]: I1010 08:14:18.107861 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.107827461 podStartE2EDuration="3.107827461s" podCreationTimestamp="2025-10-10 08:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:14:18.09852659 +0000 UTC m=+4985.168117891" watchObservedRunningTime="2025-10-10 08:14:18.107827461 +0000 UTC m=+4985.177418732" Oct 10 08:14:18 crc kubenswrapper[4732]: I1010 08:14:18.660602 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:14:18 crc kubenswrapper[4732]: E1010 08:14:18.660982 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.004243 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.006780 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.018614 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.096167 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5brdb\" (UniqueName: \"kubernetes.io/projected/56b201c3-e518-4d8b-8ab1-f4f3282ce836-kube-api-access-5brdb\") pod \"mariadb-client\" (UID: \"56b201c3-e518-4d8b-8ab1-f4f3282ce836\") " pod="openstack/mariadb-client" Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.197554 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5brdb\" (UniqueName: \"kubernetes.io/projected/56b201c3-e518-4d8b-8ab1-f4f3282ce836-kube-api-access-5brdb\") pod \"mariadb-client\" (UID: \"56b201c3-e518-4d8b-8ab1-f4f3282ce836\") " pod="openstack/mariadb-client" Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.238892 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5brdb\" (UniqueName: \"kubernetes.io/projected/56b201c3-e518-4d8b-8ab1-f4f3282ce836-kube-api-access-5brdb\") pod \"mariadb-client\" (UID: \"56b201c3-e518-4d8b-8ab1-f4f3282ce836\") " pod="openstack/mariadb-client" Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.339930 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:20 crc kubenswrapper[4732]: I1010 08:14:20.808299 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:20 crc kubenswrapper[4732]: W1010 08:14:20.818559 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56b201c3_e518_4d8b_8ab1_f4f3282ce836.slice/crio-a4f3f7210b77b656124a7f130cab80cf9cc766efcfb27917c0a4f7bcdd5b2df8 WatchSource:0}: Error finding container a4f3f7210b77b656124a7f130cab80cf9cc766efcfb27917c0a4f7bcdd5b2df8: Status 404 returned error can't find the container with id a4f3f7210b77b656124a7f130cab80cf9cc766efcfb27917c0a4f7bcdd5b2df8 Oct 10 08:14:21 crc kubenswrapper[4732]: I1010 08:14:21.112617 4732 generic.go:334] "Generic (PLEG): container finished" podID="56b201c3-e518-4d8b-8ab1-f4f3282ce836" containerID="4ef2ee8a6a9bdc571ac397badaea08db591968935e4dd3dbce15a9b1ea20f500" exitCode=0 Oct 10 08:14:21 crc kubenswrapper[4732]: I1010 08:14:21.112789 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56b201c3-e518-4d8b-8ab1-f4f3282ce836","Type":"ContainerDied","Data":"4ef2ee8a6a9bdc571ac397badaea08db591968935e4dd3dbce15a9b1ea20f500"} Oct 10 08:14:21 crc kubenswrapper[4732]: I1010 08:14:21.112941 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"56b201c3-e518-4d8b-8ab1-f4f3282ce836","Type":"ContainerStarted","Data":"a4f3f7210b77b656124a7f130cab80cf9cc766efcfb27917c0a4f7bcdd5b2df8"} Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.498251 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.525276 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_56b201c3-e518-4d8b-8ab1-f4f3282ce836/mariadb-client/0.log" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.553939 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.560385 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.647473 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5brdb\" (UniqueName: \"kubernetes.io/projected/56b201c3-e518-4d8b-8ab1-f4f3282ce836-kube-api-access-5brdb\") pod \"56b201c3-e518-4d8b-8ab1-f4f3282ce836\" (UID: \"56b201c3-e518-4d8b-8ab1-f4f3282ce836\") " Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.657073 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56b201c3-e518-4d8b-8ab1-f4f3282ce836-kube-api-access-5brdb" (OuterVolumeSpecName: "kube-api-access-5brdb") pod "56b201c3-e518-4d8b-8ab1-f4f3282ce836" (UID: "56b201c3-e518-4d8b-8ab1-f4f3282ce836"). InnerVolumeSpecName "kube-api-access-5brdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.731900 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:22 crc kubenswrapper[4732]: E1010 08:14:22.732587 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56b201c3-e518-4d8b-8ab1-f4f3282ce836" containerName="mariadb-client" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.732608 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56b201c3-e518-4d8b-8ab1-f4f3282ce836" containerName="mariadb-client" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.732950 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56b201c3-e518-4d8b-8ab1-f4f3282ce836" containerName="mariadb-client" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.733562 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.741338 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.769492 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5brdb\" (UniqueName: \"kubernetes.io/projected/56b201c3-e518-4d8b-8ab1-f4f3282ce836-kube-api-access-5brdb\") on node \"crc\" DevicePath \"\"" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.871544 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glch9\" (UniqueName: \"kubernetes.io/projected/62a92e5a-e387-4ebf-b610-f244c9c9bf7e-kube-api-access-glch9\") pod \"mariadb-client\" (UID: \"62a92e5a-e387-4ebf-b610-f244c9c9bf7e\") " pod="openstack/mariadb-client" Oct 10 08:14:22 crc kubenswrapper[4732]: I1010 08:14:22.972760 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glch9\" (UniqueName: 
\"kubernetes.io/projected/62a92e5a-e387-4ebf-b610-f244c9c9bf7e-kube-api-access-glch9\") pod \"mariadb-client\" (UID: \"62a92e5a-e387-4ebf-b610-f244c9c9bf7e\") " pod="openstack/mariadb-client" Oct 10 08:14:23 crc kubenswrapper[4732]: I1010 08:14:23.000821 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glch9\" (UniqueName: \"kubernetes.io/projected/62a92e5a-e387-4ebf-b610-f244c9c9bf7e-kube-api-access-glch9\") pod \"mariadb-client\" (UID: \"62a92e5a-e387-4ebf-b610-f244c9c9bf7e\") " pod="openstack/mariadb-client" Oct 10 08:14:23 crc kubenswrapper[4732]: I1010 08:14:23.075116 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:23 crc kubenswrapper[4732]: I1010 08:14:23.147880 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f3f7210b77b656124a7f130cab80cf9cc766efcfb27917c0a4f7bcdd5b2df8" Oct 10 08:14:23 crc kubenswrapper[4732]: I1010 08:14:23.148373 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:23 crc kubenswrapper[4732]: I1010 08:14:23.183023 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="56b201c3-e518-4d8b-8ab1-f4f3282ce836" podUID="62a92e5a-e387-4ebf-b610-f244c9c9bf7e" Oct 10 08:14:23 crc kubenswrapper[4732]: I1010 08:14:23.335201 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:23 crc kubenswrapper[4732]: W1010 08:14:23.339777 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62a92e5a_e387_4ebf_b610_f244c9c9bf7e.slice/crio-3183ec41558f6bf9a3f893811d8238cbfb5cc3ca87f7761fb15edcdd7d505fba WatchSource:0}: Error finding container 3183ec41558f6bf9a3f893811d8238cbfb5cc3ca87f7761fb15edcdd7d505fba: Status 404 returned error can't find the container with id 3183ec41558f6bf9a3f893811d8238cbfb5cc3ca87f7761fb15edcdd7d505fba Oct 10 08:14:23 crc kubenswrapper[4732]: I1010 08:14:23.676144 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56b201c3-e518-4d8b-8ab1-f4f3282ce836" path="/var/lib/kubelet/pods/56b201c3-e518-4d8b-8ab1-f4f3282ce836/volumes" Oct 10 08:14:24 crc kubenswrapper[4732]: I1010 08:14:24.159419 4732 generic.go:334] "Generic (PLEG): container finished" podID="62a92e5a-e387-4ebf-b610-f244c9c9bf7e" containerID="5d0c5d69e12e62bc0abb94c79dd19e65480ef2ee4c4110aba99ff17674f4c8ba" exitCode=0 Oct 10 08:14:24 crc kubenswrapper[4732]: I1010 08:14:24.159512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"62a92e5a-e387-4ebf-b610-f244c9c9bf7e","Type":"ContainerDied","Data":"5d0c5d69e12e62bc0abb94c79dd19e65480ef2ee4c4110aba99ff17674f4c8ba"} Oct 10 08:14:24 crc kubenswrapper[4732]: I1010 08:14:24.159906 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"62a92e5a-e387-4ebf-b610-f244c9c9bf7e","Type":"ContainerStarted","Data":"3183ec41558f6bf9a3f893811d8238cbfb5cc3ca87f7761fb15edcdd7d505fba"} Oct 10 08:14:25 crc kubenswrapper[4732]: I1010 08:14:25.538176 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:25 crc kubenswrapper[4732]: I1010 08:14:25.555545 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_62a92e5a-e387-4ebf-b610-f244c9c9bf7e/mariadb-client/0.log" Oct 10 08:14:25 crc kubenswrapper[4732]: I1010 08:14:25.579102 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:25 crc kubenswrapper[4732]: I1010 08:14:25.585572 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 10 08:14:25 crc kubenswrapper[4732]: I1010 08:14:25.718826 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glch9\" (UniqueName: \"kubernetes.io/projected/62a92e5a-e387-4ebf-b610-f244c9c9bf7e-kube-api-access-glch9\") pod \"62a92e5a-e387-4ebf-b610-f244c9c9bf7e\" (UID: \"62a92e5a-e387-4ebf-b610-f244c9c9bf7e\") " Oct 10 08:14:25 crc kubenswrapper[4732]: I1010 08:14:25.725156 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a92e5a-e387-4ebf-b610-f244c9c9bf7e-kube-api-access-glch9" (OuterVolumeSpecName: "kube-api-access-glch9") pod "62a92e5a-e387-4ebf-b610-f244c9c9bf7e" (UID: "62a92e5a-e387-4ebf-b610-f244c9c9bf7e"). InnerVolumeSpecName "kube-api-access-glch9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:14:25 crc kubenswrapper[4732]: I1010 08:14:25.821377 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glch9\" (UniqueName: \"kubernetes.io/projected/62a92e5a-e387-4ebf-b610-f244c9c9bf7e-kube-api-access-glch9\") on node \"crc\" DevicePath \"\"" Oct 10 08:14:26 crc kubenswrapper[4732]: I1010 08:14:26.179462 4732 scope.go:117] "RemoveContainer" containerID="5d0c5d69e12e62bc0abb94c79dd19e65480ef2ee4c4110aba99ff17674f4c8ba" Oct 10 08:14:26 crc kubenswrapper[4732]: I1010 08:14:26.179869 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 10 08:14:27 crc kubenswrapper[4732]: I1010 08:14:27.676788 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62a92e5a-e387-4ebf-b610-f244c9c9bf7e" path="/var/lib/kubelet/pods/62a92e5a-e387-4ebf-b610-f244c9c9bf7e/volumes" Oct 10 08:14:30 crc kubenswrapper[4732]: I1010 08:14:30.660104 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:14:30 crc kubenswrapper[4732]: E1010 08:14:30.660933 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:14:45 crc kubenswrapper[4732]: I1010 08:14:45.662279 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:14:45 crc kubenswrapper[4732]: E1010 08:14:45.663609 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:14:58 crc kubenswrapper[4732]: E1010 08:14:58.244273 4732 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.246:35176->38.102.83.246:40175: read tcp 38.102.83.246:35176->38.102.83.246:40175: read: connection reset by peer Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.151915 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7"] Oct 10 08:15:00 crc kubenswrapper[4732]: E1010 08:15:00.152749 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a92e5a-e387-4ebf-b610-f244c9c9bf7e" containerName="mariadb-client" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.152769 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a92e5a-e387-4ebf-b610-f244c9c9bf7e" containerName="mariadb-client" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.152986 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a92e5a-e387-4ebf-b610-f244c9c9bf7e" containerName="mariadb-client" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.153770 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.156919 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.157141 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.171618 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7"] Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.292949 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46prf\" (UniqueName: \"kubernetes.io/projected/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-kube-api-access-46prf\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.293085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-secret-volume\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.293220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-config-volume\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.394882 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46prf\" (UniqueName: \"kubernetes.io/projected/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-kube-api-access-46prf\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.394945 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-secret-volume\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.394988 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-config-volume\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.396259 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-config-volume\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.402722 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-secret-volume\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.426534 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46prf\" (UniqueName: \"kubernetes.io/projected/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-kube-api-access-46prf\") pod \"collect-profiles-29334735-hnhr7\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.481102 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.661291 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:15:00 crc kubenswrapper[4732]: E1010 08:15:00.661868 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:15:00 crc kubenswrapper[4732]: I1010 08:15:00.945265 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7"] Oct 10 08:15:01 crc kubenswrapper[4732]: I1010 08:15:01.532121 4732 generic.go:334] "Generic (PLEG): container finished" podID="49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" containerID="17ecf4b8ed3713739ca10fb0b787ff3b55b05ee3b2855b7d90b4ff2b25f53c9b" 
exitCode=0
Oct 10 08:15:01 crc kubenswrapper[4732]: I1010 08:15:01.532326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" event={"ID":"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c","Type":"ContainerDied","Data":"17ecf4b8ed3713739ca10fb0b787ff3b55b05ee3b2855b7d90b4ff2b25f53c9b"}
Oct 10 08:15:01 crc kubenswrapper[4732]: I1010 08:15:01.532438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" event={"ID":"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c","Type":"ContainerStarted","Data":"cfaee459d85ff3639f8e83cf12c90bfe6144a4952d21bde83840dbe4ea56324f"}
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.692992 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.694461 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.698444 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.699305 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.699357 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.699377 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.699305 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-kmqfn"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.716093 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"]
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.718858 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.725982 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"]
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.727880 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.738121 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.755016 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.786917 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938332 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938376 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938455 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938487 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938535 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938580 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938593 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvw2m\" (UniqueName: \"kubernetes.io/projected/defba928-0686-4eb5-b82f-a3d81310408c-kube-api-access-tvw2m\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938609 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938744 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938781 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/defba928-0686-4eb5-b82f-a3d81310408c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938808 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938843 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq6dt\" (UniqueName: \"kubernetes.io/projected/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-kube-api-access-rq6dt\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938877 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defba928-0686-4eb5-b82f-a3d81310408c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938899 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.938953 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939008 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939037 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939070 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d4839beb-1eac-412a-9e00-bd3283871967\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4839beb-1eac-412a-9e00-bd3283871967\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939239 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939284 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jrhf\" (UniqueName: \"kubernetes.io/projected/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-kube-api-access-5jrhf\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939327 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-config\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:02 crc kubenswrapper[4732]: I1010 08:15:02.939398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba928-0686-4eb5-b82f-a3d81310408c-config\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.041273 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.041319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.041343 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042046 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042093 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042115 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvw2m\" (UniqueName: \"kubernetes.io/projected/defba928-0686-4eb5-b82f-a3d81310408c-kube-api-access-tvw2m\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042144 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042181 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042201 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/defba928-0686-4eb5-b82f-a3d81310408c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042221 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042254 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq6dt\" (UniqueName: \"kubernetes.io/projected/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-kube-api-access-rq6dt\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042292 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defba928-0686-4eb5-b82f-a3d81310408c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042314 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042358 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042380 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042405 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d4839beb-1eac-412a-9e00-bd3283871967\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4839beb-1eac-412a-9e00-bd3283871967\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042482 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jrhf\" (UniqueName: \"kubernetes.io/projected/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-kube-api-access-5jrhf\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042503 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042527 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042550 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-config\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba928-0686-4eb5-b82f-a3d81310408c-config\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042611 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042705 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.042949 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.043093 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/defba928-0686-4eb5-b82f-a3d81310408c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.043676 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-config\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.044191 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.044820 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/defba928-0686-4eb5-b82f-a3d81310408c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.045539 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.045566 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/edcb57f01ebc765009fcfaed353864dd71960eb7600f9d61d0c9388118ced68a/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.045926 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defba928-0686-4eb5-b82f-a3d81310408c-config\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.046613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-config\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.046728 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.047877 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.047914 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.048210 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.048232 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/74906fe606450a11bb6680658b4426cb425a22c6d845cd555b015949f81d1149/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.048828 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.049158 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.049205 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d4839beb-1eac-412a-9e00-bd3283871967\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4839beb-1eac-412a-9e00-bd3283871967\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be768fa894bac7d75a0ddacc22d12c0571b0b6f93a1a7096ddd5d41bd46ff941/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.049981 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.051278 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.052284 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/defba928-0686-4eb5-b82f-a3d81310408c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.052948 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.053665 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.066183 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jrhf\" (UniqueName: \"kubernetes.io/projected/55afdb4c-6296-4aed-881f-69bc7cfa7f2b-kube-api-access-5jrhf\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.067502 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.068630 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq6dt\" (UniqueName: \"kubernetes.io/projected/67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4-kube-api-access-rq6dt\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.071113 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvw2m\" (UniqueName: \"kubernetes.io/projected/defba928-0686-4eb5-b82f-a3d81310408c-kube-api-access-tvw2m\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.124677 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b8c693cf-0c30-4c0e-857d-f6f7f22b2cf6\") pod \"ovsdbserver-nb-2\" (UID: \"defba928-0686-4eb5-b82f-a3d81310408c\") " pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.125275 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d4839beb-1eac-412a-9e00-bd3283871967\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d4839beb-1eac-412a-9e00-bd3283871967\") pod \"ovsdbserver-nb-0\" (UID: \"55afdb4c-6296-4aed-881f-69bc7cfa7f2b\") " pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.132484 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fb64327-c941-45eb-b3ba-2be6020d2d86\") pod \"ovsdbserver-nb-1\" (UID: \"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4\") " pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.138948 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.246050 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-config-volume\") pod \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") "
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.246289 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-secret-volume\") pod \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") "
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.246370 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46prf\" (UniqueName: \"kubernetes.io/projected/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-kube-api-access-46prf\") pod \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\" (UID: \"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c\") "
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.246851 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-config-volume" (OuterVolumeSpecName: "config-volume") pod "49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" (UID: "49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.249528 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" (UID: "49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.250162 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-kube-api-access-46prf" (OuterVolumeSpecName: "kube-api-access-46prf") pod "49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" (UID: "49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c"). InnerVolumeSpecName "kube-api-access-46prf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.322838 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.346820 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.347758 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46prf\" (UniqueName: \"kubernetes.io/projected/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-kube-api-access-46prf\") on node \"crc\" DevicePath \"\""
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.347802 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-config-volume\") on node \"crc\" DevicePath \"\""
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.347812 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.361827 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.549125 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7" event={"ID":"49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c","Type":"ContainerDied","Data":"cfaee459d85ff3639f8e83cf12c90bfe6144a4952d21bde83840dbe4ea56324f"}
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.549168 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfaee459d85ff3639f8e83cf12c90bfe6144a4952d21bde83840dbe4ea56324f"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.549170 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.691022 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Oct 10 08:15:03 crc kubenswrapper[4732]: W1010 08:15:03.693376 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55afdb4c_6296_4aed_881f_69bc7cfa7f2b.slice/crio-bd27d554af3d44e305f00f93cfd5ee0d99814398cd773a79676a44ec84c6a0e2 WatchSource:0}: Error finding container bd27d554af3d44e305f00f93cfd5ee0d99814398cd773a79676a44ec84c6a0e2: Status 404 returned error can't find the container with id bd27d554af3d44e305f00f93cfd5ee0d99814398cd773a79676a44ec84c6a0e2
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.916497 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Oct 10 08:15:03 crc kubenswrapper[4732]: E1010 08:15:03.917303 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" containerName="collect-profiles"
Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.917326 4732 state_mem.go:107] "Deleted CPUSet assignment"
podUID="49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" containerName="collect-profiles" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.917530 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" containerName="collect-profiles" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.918939 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.925249 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.927166 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.927403 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.931176 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2fslg" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.934562 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.942163 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.943899 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.956251 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.957538 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.957811 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcxfq\" (UniqueName: \"kubernetes.io/projected/855f8bf2-6c81-4281-9363-180138a2aea0-kube-api-access-wcxfq\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.957872 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.958196 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.958233 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/855f8bf2-6c81-4281-9363-180138a2aea0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.958256 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.958277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f8bf2-6c81-4281-9363-180138a2aea0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.958337 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.958367 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f8bf2-6c81-4281-9363-180138a2aea0-config\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.962980 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 10 08:15:03 crc kubenswrapper[4732]: I1010 08:15:03.990465 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.035873 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.059306 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcxfq\" (UniqueName: \"kubernetes.io/projected/855f8bf2-6c81-4281-9363-180138a2aea0-kube-api-access-wcxfq\") pod \"ovsdbserver-sb-0\" (UID: 
\"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.059422 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2755dd3e-b135-4dcc-8016-7f1034232bb9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.059741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.059827 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-46bd009e-dfef-4fae-9187-c38d801be87d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46bd009e-dfef-4fae-9187-c38d801be87d\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.059896 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.059980 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" 
(UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060076 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060147 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/855f8bf2-6c81-4281-9363-180138a2aea0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060209 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060280 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f8bf2-6c81-4281-9363-180138a2aea0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060369 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2755dd3e-b135-4dcc-8016-7f1034232bb9-config\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060437 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5qb\" (UniqueName: \"kubernetes.io/projected/2755dd3e-b135-4dcc-8016-7f1034232bb9-kube-api-access-cd5qb\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060502 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060571 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060637 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2755dd3e-b135-4dcc-8016-7f1034232bb9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.060729 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f8bf2-6c81-4281-9363-180138a2aea0-config\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.061304 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/855f8bf2-6c81-4281-9363-180138a2aea0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.061764 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f8bf2-6c81-4281-9363-180138a2aea0-config\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.062493 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f8bf2-6c81-4281-9363-180138a2aea0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.063048 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.063083 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/548726d9fd63ec556596cdf9e5cf64522f8f594535afd734357fb3066c69b1a5/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.065332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.065410 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.068493 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f8bf2-6c81-4281-9363-180138a2aea0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.078014 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcxfq\" (UniqueName: \"kubernetes.io/projected/855f8bf2-6c81-4281-9363-180138a2aea0-kube-api-access-wcxfq\") pod \"ovsdbserver-sb-0\" (UID: 
\"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.096167 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-794d58a2-ee64-4d43-a514-b64a5d53343e\") pod \"ovsdbserver-sb-0\" (UID: \"855f8bf2-6c81-4281-9363-180138a2aea0\") " pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.129572 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 10 08:15:04 crc kubenswrapper[4732]: W1010 08:15:04.130408 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddefba928_0686_4eb5_b82f_a3d81310408c.slice/crio-d20f8bc2e8a8aee2f8f2a8e6377945c304cca29907734848a56cd367c4248959 WatchSource:0}: Error finding container d20f8bc2e8a8aee2f8f2a8e6377945c304cca29907734848a56cd367c4248959: Status 404 returned error can't find the container with id d20f8bc2e8a8aee2f8f2a8e6377945c304cca29907734848a56cd367c4248959 Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164283 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2755dd3e-b135-4dcc-8016-7f1034232bb9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164309 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164331 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-46bd009e-dfef-4fae-9187-c38d801be87d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46bd009e-dfef-4fae-9187-c38d801be87d\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164357 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164384 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c9473b7d-16bd-43aa-b01b-c977c828b6ad-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164421 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164456 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164475 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164496 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2755dd3e-b135-4dcc-8016-7f1034232bb9-config\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164515 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5qb\" (UniqueName: \"kubernetes.io/projected/2755dd3e-b135-4dcc-8016-7f1034232bb9-kube-api-access-cd5qb\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164532 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9473b7d-16bd-43aa-b01b-c977c828b6ad-config\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164548 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9473b7d-16bd-43aa-b01b-c977c828b6ad-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2755dd3e-b135-4dcc-8016-7f1034232bb9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.164611 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lcz\" (UniqueName: \"kubernetes.io/projected/c9473b7d-16bd-43aa-b01b-c977c828b6ad-kube-api-access-68lcz\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.166499 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2755dd3e-b135-4dcc-8016-7f1034232bb9-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.168918 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2755dd3e-b135-4dcc-8016-7f1034232bb9-config\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " 
pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.170660 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.170718 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.172561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2755dd3e-b135-4dcc-8016-7f1034232bb9-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.176229 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.176773 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-46bd009e-dfef-4fae-9187-c38d801be87d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46bd009e-dfef-4fae-9187-c38d801be87d\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/47581bab18bee82851d4bff7172775d2af3c9a9b7dcbbe0417f8dd93e46b27f8/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.184494 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2755dd3e-b135-4dcc-8016-7f1034232bb9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.197247 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5qb\" (UniqueName: \"kubernetes.io/projected/2755dd3e-b135-4dcc-8016-7f1034232bb9-kube-api-access-cd5qb\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.218472 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6"] Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.224006 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334690-zzsp6"] Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.225130 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-46bd009e-dfef-4fae-9187-c38d801be87d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-46bd009e-dfef-4fae-9187-c38d801be87d\") pod \"ovsdbserver-sb-2\" (UID: \"2755dd3e-b135-4dcc-8016-7f1034232bb9\") " pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.244652 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266394 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266475 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266508 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266542 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9473b7d-16bd-43aa-b01b-c977c828b6ad-config\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266573 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9473b7d-16bd-43aa-b01b-c977c828b6ad-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266607 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lcz\" (UniqueName: \"kubernetes.io/projected/c9473b7d-16bd-43aa-b01b-c977c828b6ad-kube-api-access-68lcz\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.266765 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c9473b7d-16bd-43aa-b01b-c977c828b6ad-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.267309 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c9473b7d-16bd-43aa-b01b-c977c828b6ad-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.268142 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9473b7d-16bd-43aa-b01b-c977c828b6ad-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.268357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9473b7d-16bd-43aa-b01b-c977c828b6ad-config\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.270301 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.270714 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.271095 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.271131 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a8c53dfef7836a89b49a0ff6ad41afc4ee912b4766da4cd33f1be5e00fb116fc/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.272557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.285395 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9473b7d-16bd-43aa-b01b-c977c828b6ad-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.288000 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lcz\" (UniqueName: \"kubernetes.io/projected/c9473b7d-16bd-43aa-b01b-c977c828b6ad-kube-api-access-68lcz\") pod \"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.302555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c1f55200-b210-46ee-9a0d-56bf31fa2b1c\") pod 
\"ovsdbserver-sb-1\" (UID: \"c9473b7d-16bd-43aa-b01b-c977c828b6ad\") " pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.560540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4","Type":"ContainerStarted","Data":"6d94ee0c19130271909d1062220f6e7365eab5874b84588563f5fa62c3012a59"} Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.562150 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"defba928-0686-4eb5-b82f-a3d81310408c","Type":"ContainerStarted","Data":"d20f8bc2e8a8aee2f8f2a8e6377945c304cca29907734848a56cd367c4248959"} Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.563967 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55afdb4c-6296-4aed-881f-69bc7cfa7f2b","Type":"ContainerStarted","Data":"bd27d554af3d44e305f00f93cfd5ee0d99814398cd773a79676a44ec84c6a0e2"} Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.581406 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.762128 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 10 08:15:04 crc kubenswrapper[4732]: W1010 08:15:04.780482 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855f8bf2_6c81_4281_9363_180138a2aea0.slice/crio-513fb00503c6647d33cf4ea763662f203e8662da11918a270b56feff0fe3b4b8 WatchSource:0}: Error finding container 513fb00503c6647d33cf4ea763662f203e8662da11918a270b56feff0fe3b4b8: Status 404 returned error can't find the container with id 513fb00503c6647d33cf4ea763662f203e8662da11918a270b56feff0fe3b4b8 Oct 10 08:15:04 crc kubenswrapper[4732]: I1010 08:15:04.857339 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 10 08:15:04 crc kubenswrapper[4732]: W1010 08:15:04.866563 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2755dd3e_b135_4dcc_8016_7f1034232bb9.slice/crio-0b66a27ba764d7af7278bf26d0f3d91ab537138888e6a43f5af34c3d742cd444 WatchSource:0}: Error finding container 0b66a27ba764d7af7278bf26d0f3d91ab537138888e6a43f5af34c3d742cd444: Status 404 returned error can't find the container with id 0b66a27ba764d7af7278bf26d0f3d91ab537138888e6a43f5af34c3d742cd444 Oct 10 08:15:05 crc kubenswrapper[4732]: I1010 08:15:05.074119 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 10 08:15:05 crc kubenswrapper[4732]: W1010 08:15:05.083362 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9473b7d_16bd_43aa_b01b_c977c828b6ad.slice/crio-540e13ad6ae4847da7c9fd24398e99190b71277170e795c8f115443a57aefe5c WatchSource:0}: Error finding container 
540e13ad6ae4847da7c9fd24398e99190b71277170e795c8f115443a57aefe5c: Status 404 returned error can't find the container with id 540e13ad6ae4847da7c9fd24398e99190b71277170e795c8f115443a57aefe5c Oct 10 08:15:05 crc kubenswrapper[4732]: I1010 08:15:05.573845 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c9473b7d-16bd-43aa-b01b-c977c828b6ad","Type":"ContainerStarted","Data":"540e13ad6ae4847da7c9fd24398e99190b71277170e795c8f115443a57aefe5c"} Oct 10 08:15:05 crc kubenswrapper[4732]: I1010 08:15:05.575453 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2755dd3e-b135-4dcc-8016-7f1034232bb9","Type":"ContainerStarted","Data":"0b66a27ba764d7af7278bf26d0f3d91ab537138888e6a43f5af34c3d742cd444"} Oct 10 08:15:05 crc kubenswrapper[4732]: I1010 08:15:05.577088 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"855f8bf2-6c81-4281-9363-180138a2aea0","Type":"ContainerStarted","Data":"513fb00503c6647d33cf4ea763662f203e8662da11918a270b56feff0fe3b4b8"} Oct 10 08:15:05 crc kubenswrapper[4732]: I1010 08:15:05.670140 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2965b0ef-afde-4345-8b09-d1e84cbdf542" path="/var/lib/kubelet/pods/2965b0ef-afde-4345-8b09-d1e84cbdf542/volumes" Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.615471 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"855f8bf2-6c81-4281-9363-180138a2aea0","Type":"ContainerStarted","Data":"d9a402a2b5dabd352b41ce7c5adcebdbbaffc6ced71705d94bac61a65da4ea97"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.616113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"855f8bf2-6c81-4281-9363-180138a2aea0","Type":"ContainerStarted","Data":"9e0ca62f8701851e10ec1c8ba15226532288f36f7eaa43e258c9733f08ebd792"} Oct 10 08:15:09 crc kubenswrapper[4732]: 
I1010 08:15:09.618335 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"defba928-0686-4eb5-b82f-a3d81310408c","Type":"ContainerStarted","Data":"fd14a5c76e112b36818f3c3ddea90fd2b753f79e7ff7943e54d729817774d800"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.618376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"defba928-0686-4eb5-b82f-a3d81310408c","Type":"ContainerStarted","Data":"36c77b27c5e2e93cb3a47e046994a7b078a88b1ccaac01e593de4fad1bdcc3fd"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.621479 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55afdb4c-6296-4aed-881f-69bc7cfa7f2b","Type":"ContainerStarted","Data":"e392ce6a56c897a51b643708e81597b59b189874fe08e0a5ae58611634cd89d8"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.621548 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55afdb4c-6296-4aed-881f-69bc7cfa7f2b","Type":"ContainerStarted","Data":"d3e35f15036904c792520a743c5fe665678660957b1d99aa66f4c3aa03bec488"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.625764 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c9473b7d-16bd-43aa-b01b-c977c828b6ad","Type":"ContainerStarted","Data":"adc0096e25c123ccafdd3dbdf76b3b14998b283c71b0ef425f7b245e38e60899"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.625834 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"c9473b7d-16bd-43aa-b01b-c977c828b6ad","Type":"ContainerStarted","Data":"0ddbf01e664df85096dcfef022d56df7e87b01190f731c8d6e152980fa4c257b"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.635112 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" 
event={"ID":"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4","Type":"ContainerStarted","Data":"99872fce1aa4877c1fbb58b861d2adfe965d6bc593bde4f39f322e4eddf9932a"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.635206 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4","Type":"ContainerStarted","Data":"93cfeb145f2d619fbe9f4433de27f3fe228c16ce191108fcc331659a8911aca7"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.649577 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.297908331 podStartE2EDuration="7.649547748s" podCreationTimestamp="2025-10-10 08:15:02 +0000 UTC" firstStartedPulling="2025-10-10 08:15:04.787250928 +0000 UTC m=+5031.856842169" lastFinishedPulling="2025-10-10 08:15:08.138890345 +0000 UTC m=+5035.208481586" observedRunningTime="2025-10-10 08:15:09.645392566 +0000 UTC m=+5036.714983857" watchObservedRunningTime="2025-10-10 08:15:09.649547748 +0000 UTC m=+5036.719139029" Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.658174 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2755dd3e-b135-4dcc-8016-7f1034232bb9","Type":"ContainerStarted","Data":"4a3a9d576484324729bbd06b276a102712aa9ae6dc2f8fa9ca6b48c50a7b8bf4"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.658262 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2755dd3e-b135-4dcc-8016-7f1034232bb9","Type":"ContainerStarted","Data":"33c8d69732a91a140680927316affc02fb8558479d1f0568decbe4ef36762561"} Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.677387 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.590535548 podStartE2EDuration="8.676570987s" podCreationTimestamp="2025-10-10 08:15:01 +0000 UTC" firstStartedPulling="2025-10-10 
08:15:04.052846046 +0000 UTC m=+5031.122437287" lastFinishedPulling="2025-10-10 08:15:08.138881485 +0000 UTC m=+5035.208472726" observedRunningTime="2025-10-10 08:15:09.67482919 +0000 UTC m=+5036.744420461" watchObservedRunningTime="2025-10-10 08:15:09.676570987 +0000 UTC m=+5036.746162238" Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.702400 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.254739271 podStartE2EDuration="8.702377324s" podCreationTimestamp="2025-10-10 08:15:01 +0000 UTC" firstStartedPulling="2025-10-10 08:15:03.696484343 +0000 UTC m=+5030.766075584" lastFinishedPulling="2025-10-10 08:15:08.144122396 +0000 UTC m=+5035.213713637" observedRunningTime="2025-10-10 08:15:09.698295423 +0000 UTC m=+5036.767886684" watchObservedRunningTime="2025-10-10 08:15:09.702377324 +0000 UTC m=+5036.771968595" Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.731814 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.347084448 podStartE2EDuration="7.731796367s" podCreationTimestamp="2025-10-10 08:15:02 +0000 UTC" firstStartedPulling="2025-10-10 08:15:05.101487415 +0000 UTC m=+5032.171078656" lastFinishedPulling="2025-10-10 08:15:08.486199324 +0000 UTC m=+5035.555790575" observedRunningTime="2025-10-10 08:15:09.722550258 +0000 UTC m=+5036.792141559" watchObservedRunningTime="2025-10-10 08:15:09.731796367 +0000 UTC m=+5036.801387618" Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.756031 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.750049912 podStartE2EDuration="8.75600519s" podCreationTimestamp="2025-10-10 08:15:01 +0000 UTC" firstStartedPulling="2025-10-10 08:15:04.132217968 +0000 UTC m=+5031.201809209" lastFinishedPulling="2025-10-10 08:15:08.138173246 +0000 UTC m=+5035.207764487" observedRunningTime="2025-10-10 
08:15:09.743543774 +0000 UTC m=+5036.813135075" watchObservedRunningTime="2025-10-10 08:15:09.75600519 +0000 UTC m=+5036.825596461" Oct 10 08:15:09 crc kubenswrapper[4732]: I1010 08:15:09.771886 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.486246053 podStartE2EDuration="7.771856718s" podCreationTimestamp="2025-10-10 08:15:02 +0000 UTC" firstStartedPulling="2025-10-10 08:15:04.869070996 +0000 UTC m=+5031.938662247" lastFinishedPulling="2025-10-10 08:15:08.154681671 +0000 UTC m=+5035.224272912" observedRunningTime="2025-10-10 08:15:09.770027578 +0000 UTC m=+5036.839618849" watchObservedRunningTime="2025-10-10 08:15:09.771856718 +0000 UTC m=+5036.841448029" Oct 10 08:15:10 crc kubenswrapper[4732]: I1010 08:15:10.245059 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:10 crc kubenswrapper[4732]: I1010 08:15:10.271496 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:10 crc kubenswrapper[4732]: I1010 08:15:10.581818 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.323286 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.347603 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.362621 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.390127 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 
08:15:12.390287 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.421844 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.713557 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.715803 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 10 08:15:12 crc kubenswrapper[4732]: I1010 08:15:12.717098 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.321286 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.321771 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.338552 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.339358 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.386983 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.402978 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.405916 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 10 08:15:13 crc 
kubenswrapper[4732]: I1010 08:15:13.416104 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.419568 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.590438 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b77fb4d9c-7hbpc"] Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.592001 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.594604 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.619211 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b77fb4d9c-7hbpc"] Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.634762 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.636178 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.708491 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.744289 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-ovsdbserver-nb\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.744335 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmn9\" (UniqueName: \"kubernetes.io/projected/4f2d6445-cc44-449e-8651-0256b1fe7a24-kube-api-access-wgmn9\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.744439 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-dns-svc\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.744462 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-config\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.790110 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b77fb4d9c-7hbpc"] Oct 10 08:15:13 crc kubenswrapper[4732]: E1010 08:15:13.791512 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-wgmn9 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" podUID="4f2d6445-cc44-449e-8651-0256b1fe7a24" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.823609 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59c7f8f76f-vk6wv"] Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.824874 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.830643 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.837560 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c7f8f76f-vk6wv"] Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.845583 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-dns-svc\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.845622 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-config\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.845871 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-ovsdbserver-nb\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.845907 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmn9\" (UniqueName: \"kubernetes.io/projected/4f2d6445-cc44-449e-8651-0256b1fe7a24-kube-api-access-wgmn9\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.847194 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-dns-svc\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.847799 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-config\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.848985 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.858221 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-ovsdbserver-nb\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:13 crc kubenswrapper[4732]: I1010 08:15:13.867242 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmn9\" (UniqueName: \"kubernetes.io/projected/4f2d6445-cc44-449e-8651-0256b1fe7a24-kube-api-access-wgmn9\") pod \"dnsmasq-dns-5b77fb4d9c-7hbpc\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.050205 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-dns-svc\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " 
pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.050281 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-config\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.050401 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-sb\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.050521 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-nb\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.050640 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbpqf\" (UniqueName: \"kubernetes.io/projected/615c5712-3935-437d-bd82-3b70be8299df-kube-api-access-xbpqf\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.151857 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-dns-svc\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: 
\"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.151949 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-config\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.152005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-sb\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.152052 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-nb\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.152113 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbpqf\" (UniqueName: \"kubernetes.io/projected/615c5712-3935-437d-bd82-3b70be8299df-kube-api-access-xbpqf\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.152890 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-dns-svc\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " 
pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.152977 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-config\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.153032 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-sb\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.153332 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-nb\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.174229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbpqf\" (UniqueName: \"kubernetes.io/projected/615c5712-3935-437d-bd82-3b70be8299df-kube-api-access-xbpqf\") pod \"dnsmasq-dns-59c7f8f76f-vk6wv\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.225073 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.660913 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:15:14 crc kubenswrapper[4732]: E1010 08:15:14.661980 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.682302 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59c7f8f76f-vk6wv"] Oct 10 08:15:14 crc kubenswrapper[4732]: W1010 08:15:14.688859 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod615c5712_3935_437d_bd82_3b70be8299df.slice/crio-42bfcc8347387036955be15eddbd06887d45ae1807681f1136d585ee1adf2eaa WatchSource:0}: Error finding container 42bfcc8347387036955be15eddbd06887d45ae1807681f1136d585ee1adf2eaa: Status 404 returned error can't find the container with id 42bfcc8347387036955be15eddbd06887d45ae1807681f1136d585ee1adf2eaa Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.751752 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.752015 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" event={"ID":"615c5712-3935-437d-bd82-3b70be8299df","Type":"ContainerStarted","Data":"42bfcc8347387036955be15eddbd06887d45ae1807681f1136d585ee1adf2eaa"} Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.766458 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.868024 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-dns-svc\") pod \"4f2d6445-cc44-449e-8651-0256b1fe7a24\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.868077 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-config\") pod \"4f2d6445-cc44-449e-8651-0256b1fe7a24\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.868146 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgmn9\" (UniqueName: \"kubernetes.io/projected/4f2d6445-cc44-449e-8651-0256b1fe7a24-kube-api-access-wgmn9\") pod \"4f2d6445-cc44-449e-8651-0256b1fe7a24\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.868222 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-ovsdbserver-nb\") pod \"4f2d6445-cc44-449e-8651-0256b1fe7a24\" (UID: \"4f2d6445-cc44-449e-8651-0256b1fe7a24\") " Oct 10 08:15:14 crc 
kubenswrapper[4732]: I1010 08:15:14.868759 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f2d6445-cc44-449e-8651-0256b1fe7a24" (UID: "4f2d6445-cc44-449e-8651-0256b1fe7a24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.869364 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f2d6445-cc44-449e-8651-0256b1fe7a24" (UID: "4f2d6445-cc44-449e-8651-0256b1fe7a24"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.869553 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.869589 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.871406 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-config" (OuterVolumeSpecName: "config") pod "4f2d6445-cc44-449e-8651-0256b1fe7a24" (UID: "4f2d6445-cc44-449e-8651-0256b1fe7a24"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.873070 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2d6445-cc44-449e-8651-0256b1fe7a24-kube-api-access-wgmn9" (OuterVolumeSpecName: "kube-api-access-wgmn9") pod "4f2d6445-cc44-449e-8651-0256b1fe7a24" (UID: "4f2d6445-cc44-449e-8651-0256b1fe7a24"). InnerVolumeSpecName "kube-api-access-wgmn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.970497 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2d6445-cc44-449e-8651-0256b1fe7a24-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:14 crc kubenswrapper[4732]: I1010 08:15:14.970525 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgmn9\" (UniqueName: \"kubernetes.io/projected/4f2d6445-cc44-449e-8651-0256b1fe7a24-kube-api-access-wgmn9\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:15 crc kubenswrapper[4732]: I1010 08:15:15.762955 4732 generic.go:334] "Generic (PLEG): container finished" podID="615c5712-3935-437d-bd82-3b70be8299df" containerID="7c9defd0248280428792c98933b9b7afc1b84199fe82ae3934a01105b1adb60f" exitCode=0 Oct 10 08:15:15 crc kubenswrapper[4732]: I1010 08:15:15.763240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" event={"ID":"615c5712-3935-437d-bd82-3b70be8299df","Type":"ContainerDied","Data":"7c9defd0248280428792c98933b9b7afc1b84199fe82ae3934a01105b1adb60f"} Oct 10 08:15:15 crc kubenswrapper[4732]: I1010 08:15:15.763288 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b77fb4d9c-7hbpc" Oct 10 08:15:15 crc kubenswrapper[4732]: I1010 08:15:15.809912 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b77fb4d9c-7hbpc"] Oct 10 08:15:15 crc kubenswrapper[4732]: I1010 08:15:15.816896 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b77fb4d9c-7hbpc"] Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.534205 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.537308 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.540945 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.546005 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.599979 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsrlr\" (UniqueName: \"kubernetes.io/projected/5d21ef00-0975-464e-9ebf-c36b2e1c101e-kube-api-access-zsrlr\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.600167 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5d21ef00-0975-464e-9ebf-c36b2e1c101e-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.600251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e9bde394-af66-4806-b48a-581bdca4a910\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.701511 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsrlr\" (UniqueName: \"kubernetes.io/projected/5d21ef00-0975-464e-9ebf-c36b2e1c101e-kube-api-access-zsrlr\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.701586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5d21ef00-0975-464e-9ebf-c36b2e1c101e-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.701614 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e9bde394-af66-4806-b48a-581bdca4a910\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.705972 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.706044 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e9bde394-af66-4806-b48a-581bdca4a910\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e9c70983564777f3e9e1c657e7d58a980cee931ca0d3b6da0b77f2163d97bdde/globalmount\"" pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.717649 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5d21ef00-0975-464e-9ebf-c36b2e1c101e-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.743658 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsrlr\" (UniqueName: \"kubernetes.io/projected/5d21ef00-0975-464e-9ebf-c36b2e1c101e-kube-api-access-zsrlr\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.746111 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e9bde394-af66-4806-b48a-581bdca4a910\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910\") pod \"ovn-copy-data\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " pod="openstack/ovn-copy-data" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.788169 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" event={"ID":"615c5712-3935-437d-bd82-3b70be8299df","Type":"ContainerStarted","Data":"e469593e2af201ad667e13f8d566d1d6d25fbd584a1ba65098870622281670ff"} Oct 10 
08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.788888 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.811248 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" podStartSLOduration=3.811226407 podStartE2EDuration="3.811226407s" podCreationTimestamp="2025-10-10 08:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:15:16.807478815 +0000 UTC m=+5043.877070086" watchObservedRunningTime="2025-10-10 08:15:16.811226407 +0000 UTC m=+5043.880817658" Oct 10 08:15:16 crc kubenswrapper[4732]: I1010 08:15:16.866177 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Oct 10 08:15:17 crc kubenswrapper[4732]: I1010 08:15:17.373615 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 10 08:15:17 crc kubenswrapper[4732]: I1010 08:15:17.674727 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2d6445-cc44-449e-8651-0256b1fe7a24" path="/var/lib/kubelet/pods/4f2d6445-cc44-449e-8651-0256b1fe7a24/volumes" Oct 10 08:15:17 crc kubenswrapper[4732]: I1010 08:15:17.796463 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5d21ef00-0975-464e-9ebf-c36b2e1c101e","Type":"ContainerStarted","Data":"ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06"} Oct 10 08:15:17 crc kubenswrapper[4732]: I1010 08:15:17.796519 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5d21ef00-0975-464e-9ebf-c36b2e1c101e","Type":"ContainerStarted","Data":"2da36ac2fb5ef4433921c4e1f1a7cc24d71611ab021d396dd9ef11c2d07ab0c0"} Oct 10 08:15:17 crc kubenswrapper[4732]: I1010 08:15:17.814017 4732 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.591689639 podStartE2EDuration="2.814002267s" podCreationTimestamp="2025-10-10 08:15:15 +0000 UTC" firstStartedPulling="2025-10-10 08:15:17.375853747 +0000 UTC m=+5044.445444998" lastFinishedPulling="2025-10-10 08:15:17.598166345 +0000 UTC m=+5044.667757626" observedRunningTime="2025-10-10 08:15:17.813542265 +0000 UTC m=+5044.883133526" watchObservedRunningTime="2025-10-10 08:15:17.814002267 +0000 UTC m=+5044.883593508" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.642329 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.644663 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.647988 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.647991 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.648120 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.648886 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-d27rv" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.683328 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.737010 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94t7b\" (UniqueName: \"kubernetes.io/projected/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-kube-api-access-94t7b\") pod \"ovn-northd-0\" (UID: 
\"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.737298 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.737445 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.737605 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.737820 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.738518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-config\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: 
I1010 08:15:23.738648 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-scripts\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.840556 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.840673 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-config\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.840774 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-scripts\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.840835 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94t7b\" (UniqueName: \"kubernetes.io/projected/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-kube-api-access-94t7b\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.840863 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.840919 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.840972 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.841747 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.841893 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-config\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.841958 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-scripts\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.846846 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.850498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.851465 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.868064 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94t7b\" (UniqueName: \"kubernetes.io/projected/5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99-kube-api-access-94t7b\") pod \"ovn-northd-0\" (UID: \"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99\") " pod="openstack/ovn-northd-0" Oct 10 08:15:23 crc kubenswrapper[4732]: I1010 08:15:23.987052 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.228091 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.287125 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8549d7dd49-h2t57"] Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.295214 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" podUID="9ff69888-543b-4450-8760-23f55ccbb673" containerName="dnsmasq-dns" containerID="cri-o://ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f" gracePeriod=10 Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.455479 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 10 08:15:24 crc kubenswrapper[4732]: W1010 08:15:24.466588 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5cb6cb9a_2b53_47db_a421_c0a9f8e0ea99.slice/crio-3a00544e3ed824a1ef0282714df34c8a4fd4af4c74510a2ee0e95dea48e73e6e WatchSource:0}: Error finding container 3a00544e3ed824a1ef0282714df34c8a4fd4af4c74510a2ee0e95dea48e73e6e: Status 404 returned error can't find the container with id 3a00544e3ed824a1ef0282714df34c8a4fd4af4c74510a2ee0e95dea48e73e6e Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.718539 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.756288 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-dns-svc\") pod \"9ff69888-543b-4450-8760-23f55ccbb673\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.756376 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrp48\" (UniqueName: \"kubernetes.io/projected/9ff69888-543b-4450-8760-23f55ccbb673-kube-api-access-nrp48\") pod \"9ff69888-543b-4450-8760-23f55ccbb673\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.756456 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-config\") pod \"9ff69888-543b-4450-8760-23f55ccbb673\" (UID: \"9ff69888-543b-4450-8760-23f55ccbb673\") " Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.763392 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff69888-543b-4450-8760-23f55ccbb673-kube-api-access-nrp48" (OuterVolumeSpecName: "kube-api-access-nrp48") pod "9ff69888-543b-4450-8760-23f55ccbb673" (UID: "9ff69888-543b-4450-8760-23f55ccbb673"). InnerVolumeSpecName "kube-api-access-nrp48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.803087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-config" (OuterVolumeSpecName: "config") pod "9ff69888-543b-4450-8760-23f55ccbb673" (UID: "9ff69888-543b-4450-8760-23f55ccbb673"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.807969 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ff69888-543b-4450-8760-23f55ccbb673" (UID: "9ff69888-543b-4450-8760-23f55ccbb673"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.858463 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrp48\" (UniqueName: \"kubernetes.io/projected/9ff69888-543b-4450-8760-23f55ccbb673-kube-api-access-nrp48\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.858493 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.858509 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ff69888-543b-4450-8760-23f55ccbb673-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.867944 4732 generic.go:334] "Generic (PLEG): container finished" podID="9ff69888-543b-4450-8760-23f55ccbb673" containerID="ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f" exitCode=0 Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.868024 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.868048 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" event={"ID":"9ff69888-543b-4450-8760-23f55ccbb673","Type":"ContainerDied","Data":"ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f"} Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.868097 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8549d7dd49-h2t57" event={"ID":"9ff69888-543b-4450-8760-23f55ccbb673","Type":"ContainerDied","Data":"e1bca6895ca2e7e9911a985c434b7508e2c99b1bfe6120f7ae6da34f7800c02d"} Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.868117 4732 scope.go:117] "RemoveContainer" containerID="ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f" Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.870478 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99","Type":"ContainerStarted","Data":"3a00544e3ed824a1ef0282714df34c8a4fd4af4c74510a2ee0e95dea48e73e6e"} Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.918665 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8549d7dd49-h2t57"] Oct 10 08:15:24 crc kubenswrapper[4732]: I1010 08:15:24.924012 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8549d7dd49-h2t57"] Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.070767 4732 scope.go:117] "RemoveContainer" containerID="c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.112919 4732 scope.go:117] "RemoveContainer" containerID="ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f" Oct 10 08:15:25 crc kubenswrapper[4732]: E1010 08:15:25.113533 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f\": container with ID starting with ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f not found: ID does not exist" containerID="ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.113934 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f"} err="failed to get container status \"ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f\": rpc error: code = NotFound desc = could not find container \"ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f\": container with ID starting with ca182c8dca693266ed885c9fe42de01ee3c2c29a03afd8c79bd2ebe49f596c3f not found: ID does not exist" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.114004 4732 scope.go:117] "RemoveContainer" containerID="c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02" Oct 10 08:15:25 crc kubenswrapper[4732]: E1010 08:15:25.114478 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02\": container with ID starting with c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02 not found: ID does not exist" containerID="c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.114541 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02"} err="failed to get container status \"c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02\": rpc error: code = NotFound desc = could not find container 
\"c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02\": container with ID starting with c9f213bf260a8c7641731dd388ddb47052df3513864e00b16dcb9bea3fac6d02 not found: ID does not exist" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.149924 4732 scope.go:117] "RemoveContainer" containerID="41b83a38c9c7e348505e3e383607118be28f16cc37539af71c899957b698a648" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.254388 4732 scope.go:117] "RemoveContainer" containerID="18e4ac8756d3f4bde508164449ac6730f79866e994d4a05d74cd56dcbc62842e" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.672507 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff69888-543b-4450-8760-23f55ccbb673" path="/var/lib/kubelet/pods/9ff69888-543b-4450-8760-23f55ccbb673/volumes" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.884240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99","Type":"ContainerStarted","Data":"fd277cec406160dfbb5db7f8a77da63e8511ac8650acca8c24b0094de6dd9b2d"} Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.884298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99","Type":"ContainerStarted","Data":"785729701ac023a2d7846e151d86b6c5be7e575006b5d38875c0c5d1e6e5d033"} Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.884462 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 10 08:15:25 crc kubenswrapper[4732]: I1010 08:15:25.918218 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.2660612110000002 podStartE2EDuration="2.918198873s" podCreationTimestamp="2025-10-10 08:15:23 +0000 UTC" firstStartedPulling="2025-10-10 08:15:24.470436017 +0000 UTC m=+5051.540027258" lastFinishedPulling="2025-10-10 08:15:25.122573679 +0000 UTC 
m=+5052.192164920" observedRunningTime="2025-10-10 08:15:25.914277387 +0000 UTC m=+5052.983868698" watchObservedRunningTime="2025-10-10 08:15:25.918198873 +0000 UTC m=+5052.987790134" Oct 10 08:15:27 crc kubenswrapper[4732]: I1010 08:15:27.660892 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:15:27 crc kubenswrapper[4732]: E1010 08:15:27.663200 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.729441 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bt4s6"] Oct 10 08:15:28 crc kubenswrapper[4732]: E1010 08:15:28.729840 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff69888-543b-4450-8760-23f55ccbb673" containerName="dnsmasq-dns" Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.729855 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff69888-543b-4450-8760-23f55ccbb673" containerName="dnsmasq-dns" Oct 10 08:15:28 crc kubenswrapper[4732]: E1010 08:15:28.729870 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff69888-543b-4450-8760-23f55ccbb673" containerName="init" Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.729878 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff69888-543b-4450-8760-23f55ccbb673" containerName="init" Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.730080 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff69888-543b-4450-8760-23f55ccbb673" containerName="dnsmasq-dns" Oct 10 08:15:28 crc 
kubenswrapper[4732]: I1010 08:15:28.730694 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bt4s6" Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.740066 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bt4s6"] Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.833374 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc8f5\" (UniqueName: \"kubernetes.io/projected/fb5a352b-80c6-4aa1-b84c-f065c6d3ced6-kube-api-access-vc8f5\") pod \"keystone-db-create-bt4s6\" (UID: \"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6\") " pod="openstack/keystone-db-create-bt4s6" Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.935354 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc8f5\" (UniqueName: \"kubernetes.io/projected/fb5a352b-80c6-4aa1-b84c-f065c6d3ced6-kube-api-access-vc8f5\") pod \"keystone-db-create-bt4s6\" (UID: \"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6\") " pod="openstack/keystone-db-create-bt4s6" Oct 10 08:15:28 crc kubenswrapper[4732]: I1010 08:15:28.954435 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc8f5\" (UniqueName: \"kubernetes.io/projected/fb5a352b-80c6-4aa1-b84c-f065c6d3ced6-kube-api-access-vc8f5\") pod \"keystone-db-create-bt4s6\" (UID: \"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6\") " pod="openstack/keystone-db-create-bt4s6" Oct 10 08:15:29 crc kubenswrapper[4732]: I1010 08:15:29.054023 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bt4s6" Oct 10 08:15:29 crc kubenswrapper[4732]: I1010 08:15:29.512037 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bt4s6"] Oct 10 08:15:29 crc kubenswrapper[4732]: I1010 08:15:29.928770 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bt4s6" event={"ID":"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6","Type":"ContainerStarted","Data":"f5f1ae71b717cdc730140beaf1ca65d2ddcf46f608c3d80a7944271cab646f4b"} Oct 10 08:15:29 crc kubenswrapper[4732]: I1010 08:15:29.928811 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bt4s6" event={"ID":"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6","Type":"ContainerStarted","Data":"59c5826a130559e0dd9a52c82a8cf1ded5e82a12aa362a84ad97ae1b350fa635"} Oct 10 08:15:29 crc kubenswrapper[4732]: I1010 08:15:29.946097 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-bt4s6" podStartSLOduration=1.946073532 podStartE2EDuration="1.946073532s" podCreationTimestamp="2025-10-10 08:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:15:29.944999823 +0000 UTC m=+5057.014591064" watchObservedRunningTime="2025-10-10 08:15:29.946073532 +0000 UTC m=+5057.015664813" Oct 10 08:15:30 crc kubenswrapper[4732]: I1010 08:15:30.942773 4732 generic.go:334] "Generic (PLEG): container finished" podID="fb5a352b-80c6-4aa1-b84c-f065c6d3ced6" containerID="f5f1ae71b717cdc730140beaf1ca65d2ddcf46f608c3d80a7944271cab646f4b" exitCode=0 Oct 10 08:15:30 crc kubenswrapper[4732]: I1010 08:15:30.943188 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bt4s6" event={"ID":"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6","Type":"ContainerDied","Data":"f5f1ae71b717cdc730140beaf1ca65d2ddcf46f608c3d80a7944271cab646f4b"} Oct 10 08:15:32 crc 
kubenswrapper[4732]: I1010 08:15:32.359935 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bt4s6" Oct 10 08:15:32 crc kubenswrapper[4732]: I1010 08:15:32.498042 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc8f5\" (UniqueName: \"kubernetes.io/projected/fb5a352b-80c6-4aa1-b84c-f065c6d3ced6-kube-api-access-vc8f5\") pod \"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6\" (UID: \"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6\") " Oct 10 08:15:32 crc kubenswrapper[4732]: I1010 08:15:32.504918 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5a352b-80c6-4aa1-b84c-f065c6d3ced6-kube-api-access-vc8f5" (OuterVolumeSpecName: "kube-api-access-vc8f5") pod "fb5a352b-80c6-4aa1-b84c-f065c6d3ced6" (UID: "fb5a352b-80c6-4aa1-b84c-f065c6d3ced6"). InnerVolumeSpecName "kube-api-access-vc8f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:15:32 crc kubenswrapper[4732]: I1010 08:15:32.600588 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc8f5\" (UniqueName: \"kubernetes.io/projected/fb5a352b-80c6-4aa1-b84c-f065c6d3ced6-kube-api-access-vc8f5\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:32 crc kubenswrapper[4732]: I1010 08:15:32.986871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bt4s6" event={"ID":"fb5a352b-80c6-4aa1-b84c-f065c6d3ced6","Type":"ContainerDied","Data":"59c5826a130559e0dd9a52c82a8cf1ded5e82a12aa362a84ad97ae1b350fa635"} Oct 10 08:15:32 crc kubenswrapper[4732]: I1010 08:15:32.986929 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59c5826a130559e0dd9a52c82a8cf1ded5e82a12aa362a84ad97ae1b350fa635" Oct 10 08:15:32 crc kubenswrapper[4732]: I1010 08:15:32.986958 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bt4s6" Oct 10 08:15:38 crc kubenswrapper[4732]: I1010 08:15:38.840868 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3a60-account-create-k5pfm"] Oct 10 08:15:38 crc kubenswrapper[4732]: E1010 08:15:38.841841 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5a352b-80c6-4aa1-b84c-f065c6d3ced6" containerName="mariadb-database-create" Oct 10 08:15:38 crc kubenswrapper[4732]: I1010 08:15:38.841860 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5a352b-80c6-4aa1-b84c-f065c6d3ced6" containerName="mariadb-database-create" Oct 10 08:15:38 crc kubenswrapper[4732]: I1010 08:15:38.842046 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5a352b-80c6-4aa1-b84c-f065c6d3ced6" containerName="mariadb-database-create" Oct 10 08:15:38 crc kubenswrapper[4732]: I1010 08:15:38.842724 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3a60-account-create-k5pfm" Oct 10 08:15:38 crc kubenswrapper[4732]: I1010 08:15:38.844633 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 10 08:15:38 crc kubenswrapper[4732]: I1010 08:15:38.855341 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3a60-account-create-k5pfm"] Oct 10 08:15:38 crc kubenswrapper[4732]: I1010 08:15:38.932209 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5hs\" (UniqueName: \"kubernetes.io/projected/2fa17fdc-7931-4225-bf16-3da2631cad83-kube-api-access-9v5hs\") pod \"keystone-3a60-account-create-k5pfm\" (UID: \"2fa17fdc-7931-4225-bf16-3da2631cad83\") " pod="openstack/keystone-3a60-account-create-k5pfm" Oct 10 08:15:39 crc kubenswrapper[4732]: I1010 08:15:39.035025 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5hs\" (UniqueName: 
\"kubernetes.io/projected/2fa17fdc-7931-4225-bf16-3da2631cad83-kube-api-access-9v5hs\") pod \"keystone-3a60-account-create-k5pfm\" (UID: \"2fa17fdc-7931-4225-bf16-3da2631cad83\") " pod="openstack/keystone-3a60-account-create-k5pfm" Oct 10 08:15:39 crc kubenswrapper[4732]: I1010 08:15:39.065983 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5hs\" (UniqueName: \"kubernetes.io/projected/2fa17fdc-7931-4225-bf16-3da2631cad83-kube-api-access-9v5hs\") pod \"keystone-3a60-account-create-k5pfm\" (UID: \"2fa17fdc-7931-4225-bf16-3da2631cad83\") " pod="openstack/keystone-3a60-account-create-k5pfm" Oct 10 08:15:39 crc kubenswrapper[4732]: I1010 08:15:39.088516 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 10 08:15:39 crc kubenswrapper[4732]: I1010 08:15:39.178415 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3a60-account-create-k5pfm" Oct 10 08:15:39 crc kubenswrapper[4732]: I1010 08:15:39.563864 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3a60-account-create-k5pfm"] Oct 10 08:15:39 crc kubenswrapper[4732]: I1010 08:15:39.662830 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:15:39 crc kubenswrapper[4732]: E1010 08:15:39.665273 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:15:40 crc kubenswrapper[4732]: I1010 08:15:40.070872 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="2fa17fdc-7931-4225-bf16-3da2631cad83" containerID="f459d4958359919152e0b6dffe9e375e60b549fc946215270aae98f9f46ee5fd" exitCode=0 Oct 10 08:15:40 crc kubenswrapper[4732]: I1010 08:15:40.070932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3a60-account-create-k5pfm" event={"ID":"2fa17fdc-7931-4225-bf16-3da2631cad83","Type":"ContainerDied","Data":"f459d4958359919152e0b6dffe9e375e60b549fc946215270aae98f9f46ee5fd"} Oct 10 08:15:40 crc kubenswrapper[4732]: I1010 08:15:40.071160 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3a60-account-create-k5pfm" event={"ID":"2fa17fdc-7931-4225-bf16-3da2631cad83","Type":"ContainerStarted","Data":"0086e10fc4853aed3db0dd02bfa47bf3d95bfd89f3f6faaa322ffdb5981f5c24"} Oct 10 08:15:41 crc kubenswrapper[4732]: I1010 08:15:41.462779 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3a60-account-create-k5pfm" Oct 10 08:15:41 crc kubenswrapper[4732]: I1010 08:15:41.573171 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v5hs\" (UniqueName: \"kubernetes.io/projected/2fa17fdc-7931-4225-bf16-3da2631cad83-kube-api-access-9v5hs\") pod \"2fa17fdc-7931-4225-bf16-3da2631cad83\" (UID: \"2fa17fdc-7931-4225-bf16-3da2631cad83\") " Oct 10 08:15:41 crc kubenswrapper[4732]: I1010 08:15:41.580248 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa17fdc-7931-4225-bf16-3da2631cad83-kube-api-access-9v5hs" (OuterVolumeSpecName: "kube-api-access-9v5hs") pod "2fa17fdc-7931-4225-bf16-3da2631cad83" (UID: "2fa17fdc-7931-4225-bf16-3da2631cad83"). InnerVolumeSpecName "kube-api-access-9v5hs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:15:41 crc kubenswrapper[4732]: I1010 08:15:41.674805 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v5hs\" (UniqueName: \"kubernetes.io/projected/2fa17fdc-7931-4225-bf16-3da2631cad83-kube-api-access-9v5hs\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:42 crc kubenswrapper[4732]: I1010 08:15:42.086472 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3a60-account-create-k5pfm" event={"ID":"2fa17fdc-7931-4225-bf16-3da2631cad83","Type":"ContainerDied","Data":"0086e10fc4853aed3db0dd02bfa47bf3d95bfd89f3f6faaa322ffdb5981f5c24"} Oct 10 08:15:42 crc kubenswrapper[4732]: I1010 08:15:42.086510 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0086e10fc4853aed3db0dd02bfa47bf3d95bfd89f3f6faaa322ffdb5981f5c24" Oct 10 08:15:42 crc kubenswrapper[4732]: I1010 08:15:42.086560 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3a60-account-create-k5pfm" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.206244 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-klcgc"] Oct 10 08:15:44 crc kubenswrapper[4732]: E1010 08:15:44.206977 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fa17fdc-7931-4225-bf16-3da2631cad83" containerName="mariadb-account-create" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.206998 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fa17fdc-7931-4225-bf16-3da2631cad83" containerName="mariadb-account-create" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.207175 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fa17fdc-7931-4225-bf16-3da2631cad83" containerName="mariadb-account-create" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.207715 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.209768 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wh6h4" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.210456 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.214987 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.215013 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-klcgc"] Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.216373 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.336068 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-config-data\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.336143 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-combined-ca-bundle\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.336187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhxv\" (UniqueName: \"kubernetes.io/projected/66b02470-428a-42f3-8327-2994102a0c89-kube-api-access-hwhxv\") pod \"keystone-db-sync-klcgc\" (UID: 
\"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.438111 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-config-data\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.438194 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-combined-ca-bundle\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.438228 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwhxv\" (UniqueName: \"kubernetes.io/projected/66b02470-428a-42f3-8327-2994102a0c89-kube-api-access-hwhxv\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.448735 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-config-data\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.449415 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-combined-ca-bundle\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: 
I1010 08:15:44.458653 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwhxv\" (UniqueName: \"kubernetes.io/projected/66b02470-428a-42f3-8327-2994102a0c89-kube-api-access-hwhxv\") pod \"keystone-db-sync-klcgc\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:44 crc kubenswrapper[4732]: I1010 08:15:44.542638 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:45 crc kubenswrapper[4732]: W1010 08:15:45.075916 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66b02470_428a_42f3_8327_2994102a0c89.slice/crio-2a1763a30d61db2f69c018a87a4c9245bbc6303b1811c9706bda34bcd1219862 WatchSource:0}: Error finding container 2a1763a30d61db2f69c018a87a4c9245bbc6303b1811c9706bda34bcd1219862: Status 404 returned error can't find the container with id 2a1763a30d61db2f69c018a87a4c9245bbc6303b1811c9706bda34bcd1219862 Oct 10 08:15:45 crc kubenswrapper[4732]: I1010 08:15:45.076070 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-klcgc"] Oct 10 08:15:45 crc kubenswrapper[4732]: I1010 08:15:45.111980 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-klcgc" event={"ID":"66b02470-428a-42f3-8327-2994102a0c89","Type":"ContainerStarted","Data":"2a1763a30d61db2f69c018a87a4c9245bbc6303b1811c9706bda34bcd1219862"} Oct 10 08:15:50 crc kubenswrapper[4732]: I1010 08:15:50.170044 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-klcgc" event={"ID":"66b02470-428a-42f3-8327-2994102a0c89","Type":"ContainerStarted","Data":"2e64371bce8a8dbe6d35b45260ed30f9714a72ef683d4a641af2b0375f67e291"} Oct 10 08:15:50 crc kubenswrapper[4732]: I1010 08:15:50.196456 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-klcgc" podStartSLOduration=1.594402664 podStartE2EDuration="6.19643078s" podCreationTimestamp="2025-10-10 08:15:44 +0000 UTC" firstStartedPulling="2025-10-10 08:15:45.079446762 +0000 UTC m=+5072.149038033" lastFinishedPulling="2025-10-10 08:15:49.681474868 +0000 UTC m=+5076.751066149" observedRunningTime="2025-10-10 08:15:50.188513917 +0000 UTC m=+5077.258105158" watchObservedRunningTime="2025-10-10 08:15:50.19643078 +0000 UTC m=+5077.266022061" Oct 10 08:15:51 crc kubenswrapper[4732]: I1010 08:15:51.661738 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:15:51 crc kubenswrapper[4732]: E1010 08:15:51.662401 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:15:52 crc kubenswrapper[4732]: I1010 08:15:52.185289 4732 generic.go:334] "Generic (PLEG): container finished" podID="66b02470-428a-42f3-8327-2994102a0c89" containerID="2e64371bce8a8dbe6d35b45260ed30f9714a72ef683d4a641af2b0375f67e291" exitCode=0 Oct 10 08:15:52 crc kubenswrapper[4732]: I1010 08:15:52.185333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-klcgc" event={"ID":"66b02470-428a-42f3-8327-2994102a0c89","Type":"ContainerDied","Data":"2e64371bce8a8dbe6d35b45260ed30f9714a72ef683d4a641af2b0375f67e291"} Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.562202 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.725773 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-combined-ca-bundle\") pod \"66b02470-428a-42f3-8327-2994102a0c89\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.726495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwhxv\" (UniqueName: \"kubernetes.io/projected/66b02470-428a-42f3-8327-2994102a0c89-kube-api-access-hwhxv\") pod \"66b02470-428a-42f3-8327-2994102a0c89\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.726942 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-config-data\") pod \"66b02470-428a-42f3-8327-2994102a0c89\" (UID: \"66b02470-428a-42f3-8327-2994102a0c89\") " Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.732992 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b02470-428a-42f3-8327-2994102a0c89-kube-api-access-hwhxv" (OuterVolumeSpecName: "kube-api-access-hwhxv") pod "66b02470-428a-42f3-8327-2994102a0c89" (UID: "66b02470-428a-42f3-8327-2994102a0c89"). InnerVolumeSpecName "kube-api-access-hwhxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.769313 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66b02470-428a-42f3-8327-2994102a0c89" (UID: "66b02470-428a-42f3-8327-2994102a0c89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.797779 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-config-data" (OuterVolumeSpecName: "config-data") pod "66b02470-428a-42f3-8327-2994102a0c89" (UID: "66b02470-428a-42f3-8327-2994102a0c89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.830096 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.830896 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwhxv\" (UniqueName: \"kubernetes.io/projected/66b02470-428a-42f3-8327-2994102a0c89-kube-api-access-hwhxv\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:53 crc kubenswrapper[4732]: I1010 08:15:53.830931 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66b02470-428a-42f3-8327-2994102a0c89-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.209202 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-klcgc" event={"ID":"66b02470-428a-42f3-8327-2994102a0c89","Type":"ContainerDied","Data":"2a1763a30d61db2f69c018a87a4c9245bbc6303b1811c9706bda34bcd1219862"} Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.209286 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a1763a30d61db2f69c018a87a4c9245bbc6303b1811c9706bda34bcd1219862" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.209319 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-klcgc" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.490145 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6597979d97-ftrb6"] Oct 10 08:15:54 crc kubenswrapper[4732]: E1010 08:15:54.490572 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b02470-428a-42f3-8327-2994102a0c89" containerName="keystone-db-sync" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.490595 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b02470-428a-42f3-8327-2994102a0c89" containerName="keystone-db-sync" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.490832 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b02470-428a-42f3-8327-2994102a0c89" containerName="keystone-db-sync" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.492176 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.496195 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tqcfv"] Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.497496 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.500070 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.500115 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.500321 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.500384 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wh6h4" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.510042 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6597979d97-ftrb6"] Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.517952 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tqcfv"] Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.549803 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-sb\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.549939 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-nb\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.550033 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-dns-svc\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.550090 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-config\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.550227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnhtx\" (UniqueName: \"kubernetes.io/projected/6639fe3d-70b3-4f26-828e-e2946f744bac-kube-api-access-rnhtx\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnhtx\" (UniqueName: \"kubernetes.io/projected/6639fe3d-70b3-4f26-828e-e2946f744bac-kube-api-access-rnhtx\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651613 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-fernet-keys\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651653 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-sb\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651677 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g727\" (UniqueName: \"kubernetes.io/projected/387d3f15-0515-4ae1-aeba-4d63464f53b5-kube-api-access-9g727\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651715 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-credential-keys\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-combined-ca-bundle\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651757 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-scripts\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651803 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-config-data\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-nb\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-dns-svc\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.651876 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-config\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.652751 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-config\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.653348 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-nb\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.654367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-dns-svc\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.655003 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-sb\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.673881 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnhtx\" (UniqueName: \"kubernetes.io/projected/6639fe3d-70b3-4f26-828e-e2946f744bac-kube-api-access-rnhtx\") pod \"dnsmasq-dns-6597979d97-ftrb6\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.753159 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-config-data\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.753567 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-fernet-keys\") pod 
\"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.753680 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g727\" (UniqueName: \"kubernetes.io/projected/387d3f15-0515-4ae1-aeba-4d63464f53b5-kube-api-access-9g727\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.753754 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-credential-keys\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.753814 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-combined-ca-bundle\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.753852 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-scripts\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.757058 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-config-data\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " 
pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.757065 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-credential-keys\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.758246 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-combined-ca-bundle\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.759197 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-fernet-keys\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.762920 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-scripts\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.774753 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g727\" (UniqueName: \"kubernetes.io/projected/387d3f15-0515-4ae1-aeba-4d63464f53b5-kube-api-access-9g727\") pod \"keystone-bootstrap-tqcfv\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.822278 4732 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:54 crc kubenswrapper[4732]: I1010 08:15:54.836601 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:15:55 crc kubenswrapper[4732]: I1010 08:15:55.173138 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6597979d97-ftrb6"] Oct 10 08:15:55 crc kubenswrapper[4732]: W1010 08:15:55.177290 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6639fe3d_70b3_4f26_828e_e2946f744bac.slice/crio-e1ecc745affa6e5110acb49eaf94768e6392e4497fb1c4edabad57ae0fd2603e WatchSource:0}: Error finding container e1ecc745affa6e5110acb49eaf94768e6392e4497fb1c4edabad57ae0fd2603e: Status 404 returned error can't find the container with id e1ecc745affa6e5110acb49eaf94768e6392e4497fb1c4edabad57ae0fd2603e Oct 10 08:15:55 crc kubenswrapper[4732]: I1010 08:15:55.218206 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" event={"ID":"6639fe3d-70b3-4f26-828e-e2946f744bac","Type":"ContainerStarted","Data":"e1ecc745affa6e5110acb49eaf94768e6392e4497fb1c4edabad57ae0fd2603e"} Oct 10 08:15:55 crc kubenswrapper[4732]: I1010 08:15:55.456752 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tqcfv"] Oct 10 08:15:56 crc kubenswrapper[4732]: I1010 08:15:56.228591 4732 generic.go:334] "Generic (PLEG): container finished" podID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerID="130635c705584a6ff742f83511d184d7dbd792213d67b66f9dbc4aa4c2469ce3" exitCode=0 Oct 10 08:15:56 crc kubenswrapper[4732]: I1010 08:15:56.228682 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" 
event={"ID":"6639fe3d-70b3-4f26-828e-e2946f744bac","Type":"ContainerDied","Data":"130635c705584a6ff742f83511d184d7dbd792213d67b66f9dbc4aa4c2469ce3"} Oct 10 08:15:56 crc kubenswrapper[4732]: I1010 08:15:56.230887 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tqcfv" event={"ID":"387d3f15-0515-4ae1-aeba-4d63464f53b5","Type":"ContainerStarted","Data":"a80153ef60ebeca56a582cd838e49cc895bce0cff31faf651b12bea8db2e1f0d"} Oct 10 08:15:56 crc kubenswrapper[4732]: I1010 08:15:56.231007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tqcfv" event={"ID":"387d3f15-0515-4ae1-aeba-4d63464f53b5","Type":"ContainerStarted","Data":"90f8757290df646c77e04373f002a57ec0f5398859a556ef04120746294f1a35"} Oct 10 08:15:56 crc kubenswrapper[4732]: I1010 08:15:56.269283 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tqcfv" podStartSLOduration=2.269268996 podStartE2EDuration="2.269268996s" podCreationTimestamp="2025-10-10 08:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:15:56.264072886 +0000 UTC m=+5083.333664137" watchObservedRunningTime="2025-10-10 08:15:56.269268996 +0000 UTC m=+5083.338860227" Oct 10 08:15:57 crc kubenswrapper[4732]: I1010 08:15:57.240255 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" event={"ID":"6639fe3d-70b3-4f26-828e-e2946f744bac","Type":"ContainerStarted","Data":"cbd6f412292be551b93a3c5d5a9813f155f2251c5359c4252c4cf5f0804709a6"} Oct 10 08:15:57 crc kubenswrapper[4732]: I1010 08:15:57.240713 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:15:57 crc kubenswrapper[4732]: I1010 08:15:57.271339 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6597979d97-ftrb6" podStartSLOduration=3.271317537 podStartE2EDuration="3.271317537s" podCreationTimestamp="2025-10-10 08:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:15:57.263925338 +0000 UTC m=+5084.333516579" watchObservedRunningTime="2025-10-10 08:15:57.271317537 +0000 UTC m=+5084.340908778" Oct 10 08:15:59 crc kubenswrapper[4732]: I1010 08:15:59.260249 4732 generic.go:334] "Generic (PLEG): container finished" podID="387d3f15-0515-4ae1-aeba-4d63464f53b5" containerID="a80153ef60ebeca56a582cd838e49cc895bce0cff31faf651b12bea8db2e1f0d" exitCode=0 Oct 10 08:15:59 crc kubenswrapper[4732]: I1010 08:15:59.260332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tqcfv" event={"ID":"387d3f15-0515-4ae1-aeba-4d63464f53b5","Type":"ContainerDied","Data":"a80153ef60ebeca56a582cd838e49cc895bce0cff31faf651b12bea8db2e1f0d"} Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.648273 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.784737 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-config-data\") pod \"387d3f15-0515-4ae1-aeba-4d63464f53b5\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.784859 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g727\" (UniqueName: \"kubernetes.io/projected/387d3f15-0515-4ae1-aeba-4d63464f53b5-kube-api-access-9g727\") pod \"387d3f15-0515-4ae1-aeba-4d63464f53b5\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.784951 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-credential-keys\") pod \"387d3f15-0515-4ae1-aeba-4d63464f53b5\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.785085 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-scripts\") pod \"387d3f15-0515-4ae1-aeba-4d63464f53b5\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.785144 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-combined-ca-bundle\") pod \"387d3f15-0515-4ae1-aeba-4d63464f53b5\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.785217 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-fernet-keys\") pod \"387d3f15-0515-4ae1-aeba-4d63464f53b5\" (UID: \"387d3f15-0515-4ae1-aeba-4d63464f53b5\") " Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.795480 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-scripts" (OuterVolumeSpecName: "scripts") pod "387d3f15-0515-4ae1-aeba-4d63464f53b5" (UID: "387d3f15-0515-4ae1-aeba-4d63464f53b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.796648 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "387d3f15-0515-4ae1-aeba-4d63464f53b5" (UID: "387d3f15-0515-4ae1-aeba-4d63464f53b5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.797465 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387d3f15-0515-4ae1-aeba-4d63464f53b5-kube-api-access-9g727" (OuterVolumeSpecName: "kube-api-access-9g727") pod "387d3f15-0515-4ae1-aeba-4d63464f53b5" (UID: "387d3f15-0515-4ae1-aeba-4d63464f53b5"). InnerVolumeSpecName "kube-api-access-9g727". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.801042 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "387d3f15-0515-4ae1-aeba-4d63464f53b5" (UID: "387d3f15-0515-4ae1-aeba-4d63464f53b5"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.821578 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "387d3f15-0515-4ae1-aeba-4d63464f53b5" (UID: "387d3f15-0515-4ae1-aeba-4d63464f53b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.830744 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-config-data" (OuterVolumeSpecName: "config-data") pod "387d3f15-0515-4ae1-aeba-4d63464f53b5" (UID: "387d3f15-0515-4ae1-aeba-4d63464f53b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.887669 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.887722 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.887732 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.887740 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:00 crc 
kubenswrapper[4732]: I1010 08:16:00.887750 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/387d3f15-0515-4ae1-aeba-4d63464f53b5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:00 crc kubenswrapper[4732]: I1010 08:16:00.887758 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g727\" (UniqueName: \"kubernetes.io/projected/387d3f15-0515-4ae1-aeba-4d63464f53b5-kube-api-access-9g727\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.294175 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tqcfv" event={"ID":"387d3f15-0515-4ae1-aeba-4d63464f53b5","Type":"ContainerDied","Data":"90f8757290df646c77e04373f002a57ec0f5398859a556ef04120746294f1a35"} Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.294236 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90f8757290df646c77e04373f002a57ec0f5398859a556ef04120746294f1a35" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.294317 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-tqcfv" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.371244 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tqcfv"] Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.378480 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tqcfv"] Oct 10 08:16:01 crc kubenswrapper[4732]: E1010 08:16:01.403372 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod387d3f15_0515_4ae1_aeba_4d63464f53b5.slice\": RecentStats: unable to find data in memory cache]" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.461943 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lzxlw"] Oct 10 08:16:01 crc kubenswrapper[4732]: E1010 08:16:01.462252 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387d3f15-0515-4ae1-aeba-4d63464f53b5" containerName="keystone-bootstrap" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.462277 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="387d3f15-0515-4ae1-aeba-4d63464f53b5" containerName="keystone-bootstrap" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.462448 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="387d3f15-0515-4ae1-aeba-4d63464f53b5" containerName="keystone-bootstrap" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.463006 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.466401 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.466546 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.466549 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.466995 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wh6h4" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.481588 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lzxlw"] Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.605963 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-scripts\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.606381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-config-data\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.606604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swbbq\" (UniqueName: \"kubernetes.io/projected/8872e039-09c6-47bc-8ce5-1c512f861997-kube-api-access-swbbq\") pod \"keystone-bootstrap-lzxlw\" (UID: 
\"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.606803 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-fernet-keys\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.607036 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-combined-ca-bundle\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.607196 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-credential-keys\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.672732 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="387d3f15-0515-4ae1-aeba-4d63464f53b5" path="/var/lib/kubelet/pods/387d3f15-0515-4ae1-aeba-4d63464f53b5/volumes" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.709063 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-scripts\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.709118 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-config-data\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.709148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swbbq\" (UniqueName: \"kubernetes.io/projected/8872e039-09c6-47bc-8ce5-1c512f861997-kube-api-access-swbbq\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.709167 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-fernet-keys\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.709213 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-combined-ca-bundle\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.709234 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-credential-keys\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.715818 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-scripts\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.716505 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-fernet-keys\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.716539 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-credential-keys\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.716616 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-combined-ca-bundle\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.721968 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-config-data\") pod \"keystone-bootstrap-lzxlw\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.744687 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swbbq\" (UniqueName: \"kubernetes.io/projected/8872e039-09c6-47bc-8ce5-1c512f861997-kube-api-access-swbbq\") pod \"keystone-bootstrap-lzxlw\" (UID: 
\"8872e039-09c6-47bc-8ce5-1c512f861997\") " pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:01 crc kubenswrapper[4732]: I1010 08:16:01.782812 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:02 crc kubenswrapper[4732]: I1010 08:16:02.317782 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lzxlw"] Oct 10 08:16:03 crc kubenswrapper[4732]: I1010 08:16:03.330185 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lzxlw" event={"ID":"8872e039-09c6-47bc-8ce5-1c512f861997","Type":"ContainerStarted","Data":"7026f96cdd6e572a52269789f1415e148f0c260105369d185998641f19a91abf"} Oct 10 08:16:03 crc kubenswrapper[4732]: I1010 08:16:03.330390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lzxlw" event={"ID":"8872e039-09c6-47bc-8ce5-1c512f861997","Type":"ContainerStarted","Data":"001f23e8034c38102b1715174d33e6c9f5a341c3fae7fb38127f4a7972816aa6"} Oct 10 08:16:03 crc kubenswrapper[4732]: I1010 08:16:03.373553 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lzxlw" podStartSLOduration=2.373516425 podStartE2EDuration="2.373516425s" podCreationTimestamp="2025-10-10 08:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:16:03.353180686 +0000 UTC m=+5090.422771967" watchObservedRunningTime="2025-10-10 08:16:03.373516425 +0000 UTC m=+5090.443107736" Oct 10 08:16:04 crc kubenswrapper[4732]: I1010 08:16:04.825029 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:16:04 crc kubenswrapper[4732]: I1010 08:16:04.910882 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c7f8f76f-vk6wv"] Oct 10 08:16:04 crc kubenswrapper[4732]: 
I1010 08:16:04.911262 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" podUID="615c5712-3935-437d-bd82-3b70be8299df" containerName="dnsmasq-dns" containerID="cri-o://e469593e2af201ad667e13f8d566d1d6d25fbd584a1ba65098870622281670ff" gracePeriod=10 Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.380554 4732 generic.go:334] "Generic (PLEG): container finished" podID="615c5712-3935-437d-bd82-3b70be8299df" containerID="e469593e2af201ad667e13f8d566d1d6d25fbd584a1ba65098870622281670ff" exitCode=0 Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.380656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" event={"ID":"615c5712-3935-437d-bd82-3b70be8299df","Type":"ContainerDied","Data":"e469593e2af201ad667e13f8d566d1d6d25fbd584a1ba65098870622281670ff"} Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.381039 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" event={"ID":"615c5712-3935-437d-bd82-3b70be8299df","Type":"ContainerDied","Data":"42bfcc8347387036955be15eddbd06887d45ae1807681f1136d585ee1adf2eaa"} Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.381065 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42bfcc8347387036955be15eddbd06887d45ae1807681f1136d585ee1adf2eaa" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.394801 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.480198 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-config\") pod \"615c5712-3935-437d-bd82-3b70be8299df\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.480274 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-dns-svc\") pod \"615c5712-3935-437d-bd82-3b70be8299df\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.480440 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-nb\") pod \"615c5712-3935-437d-bd82-3b70be8299df\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.480490 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-sb\") pod \"615c5712-3935-437d-bd82-3b70be8299df\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.480521 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbpqf\" (UniqueName: \"kubernetes.io/projected/615c5712-3935-437d-bd82-3b70be8299df-kube-api-access-xbpqf\") pod \"615c5712-3935-437d-bd82-3b70be8299df\" (UID: \"615c5712-3935-437d-bd82-3b70be8299df\") " Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.495885 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/615c5712-3935-437d-bd82-3b70be8299df-kube-api-access-xbpqf" (OuterVolumeSpecName: "kube-api-access-xbpqf") pod "615c5712-3935-437d-bd82-3b70be8299df" (UID: "615c5712-3935-437d-bd82-3b70be8299df"). InnerVolumeSpecName "kube-api-access-xbpqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.522332 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "615c5712-3935-437d-bd82-3b70be8299df" (UID: "615c5712-3935-437d-bd82-3b70be8299df"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.525421 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "615c5712-3935-437d-bd82-3b70be8299df" (UID: "615c5712-3935-437d-bd82-3b70be8299df"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.526272 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "615c5712-3935-437d-bd82-3b70be8299df" (UID: "615c5712-3935-437d-bd82-3b70be8299df"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.532147 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-config" (OuterVolumeSpecName: "config") pod "615c5712-3935-437d-bd82-3b70be8299df" (UID: "615c5712-3935-437d-bd82-3b70be8299df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.581955 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.581986 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbpqf\" (UniqueName: \"kubernetes.io/projected/615c5712-3935-437d-bd82-3b70be8299df-kube-api-access-xbpqf\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.581996 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.582005 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.582013 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/615c5712-3935-437d-bd82-3b70be8299df-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:05 crc kubenswrapper[4732]: I1010 08:16:05.660338 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea" Oct 10 08:16:05 crc kubenswrapper[4732]: E1010 08:16:05.661091 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:16:06 crc kubenswrapper[4732]: I1010 08:16:06.399172 4732 generic.go:334] "Generic (PLEG): container finished" podID="8872e039-09c6-47bc-8ce5-1c512f861997" containerID="7026f96cdd6e572a52269789f1415e148f0c260105369d185998641f19a91abf" exitCode=0 Oct 10 08:16:06 crc kubenswrapper[4732]: I1010 08:16:06.399300 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59c7f8f76f-vk6wv" Oct 10 08:16:06 crc kubenswrapper[4732]: I1010 08:16:06.399298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lzxlw" event={"ID":"8872e039-09c6-47bc-8ce5-1c512f861997","Type":"ContainerDied","Data":"7026f96cdd6e572a52269789f1415e148f0c260105369d185998641f19a91abf"} Oct 10 08:16:06 crc kubenswrapper[4732]: I1010 08:16:06.423879 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59c7f8f76f-vk6wv"] Oct 10 08:16:06 crc kubenswrapper[4732]: I1010 08:16:06.432419 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59c7f8f76f-vk6wv"] Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.669832 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615c5712-3935-437d-bd82-3b70be8299df" path="/var/lib/kubelet/pods/615c5712-3935-437d-bd82-3b70be8299df/volumes" Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.816792 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.923952 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-combined-ca-bundle\") pod \"8872e039-09c6-47bc-8ce5-1c512f861997\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.924029 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-config-data\") pod \"8872e039-09c6-47bc-8ce5-1c512f861997\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.924076 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-credential-keys\") pod \"8872e039-09c6-47bc-8ce5-1c512f861997\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.924132 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-fernet-keys\") pod \"8872e039-09c6-47bc-8ce5-1c512f861997\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.924173 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swbbq\" (UniqueName: \"kubernetes.io/projected/8872e039-09c6-47bc-8ce5-1c512f861997-kube-api-access-swbbq\") pod \"8872e039-09c6-47bc-8ce5-1c512f861997\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.924216 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-scripts\") pod \"8872e039-09c6-47bc-8ce5-1c512f861997\" (UID: \"8872e039-09c6-47bc-8ce5-1c512f861997\") " Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.930069 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8872e039-09c6-47bc-8ce5-1c512f861997" (UID: "8872e039-09c6-47bc-8ce5-1c512f861997"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.931032 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8872e039-09c6-47bc-8ce5-1c512f861997-kube-api-access-swbbq" (OuterVolumeSpecName: "kube-api-access-swbbq") pod "8872e039-09c6-47bc-8ce5-1c512f861997" (UID: "8872e039-09c6-47bc-8ce5-1c512f861997"). InnerVolumeSpecName "kube-api-access-swbbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.931624 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8872e039-09c6-47bc-8ce5-1c512f861997" (UID: "8872e039-09c6-47bc-8ce5-1c512f861997"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.935878 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-scripts" (OuterVolumeSpecName: "scripts") pod "8872e039-09c6-47bc-8ce5-1c512f861997" (UID: "8872e039-09c6-47bc-8ce5-1c512f861997"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.965080 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8872e039-09c6-47bc-8ce5-1c512f861997" (UID: "8872e039-09c6-47bc-8ce5-1c512f861997"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:07 crc kubenswrapper[4732]: I1010 08:16:07.965788 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-config-data" (OuterVolumeSpecName: "config-data") pod "8872e039-09c6-47bc-8ce5-1c512f861997" (UID: "8872e039-09c6-47bc-8ce5-1c512f861997"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.026442 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.026510 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.026521 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.026529 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 
08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.026538 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swbbq\" (UniqueName: \"kubernetes.io/projected/8872e039-09c6-47bc-8ce5-1c512f861997-kube-api-access-swbbq\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.026548 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8872e039-09c6-47bc-8ce5-1c512f861997-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.417735 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lzxlw" event={"ID":"8872e039-09c6-47bc-8ce5-1c512f861997","Type":"ContainerDied","Data":"001f23e8034c38102b1715174d33e6c9f5a341c3fae7fb38127f4a7972816aa6"} Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.418083 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="001f23e8034c38102b1715174d33e6c9f5a341c3fae7fb38127f4a7972816aa6" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.417984 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lzxlw" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.546034 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-777778bb9f-g6hc7"] Oct 10 08:16:08 crc kubenswrapper[4732]: E1010 08:16:08.546593 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615c5712-3935-437d-bd82-3b70be8299df" containerName="dnsmasq-dns" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.546666 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="615c5712-3935-437d-bd82-3b70be8299df" containerName="dnsmasq-dns" Oct 10 08:16:08 crc kubenswrapper[4732]: E1010 08:16:08.546747 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615c5712-3935-437d-bd82-3b70be8299df" containerName="init" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.546759 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="615c5712-3935-437d-bd82-3b70be8299df" containerName="init" Oct 10 08:16:08 crc kubenswrapper[4732]: E1010 08:16:08.546861 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8872e039-09c6-47bc-8ce5-1c512f861997" containerName="keystone-bootstrap" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.547803 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8872e039-09c6-47bc-8ce5-1c512f861997" containerName="keystone-bootstrap" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.549325 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="615c5712-3935-437d-bd82-3b70be8299df" containerName="dnsmasq-dns" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.549364 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8872e039-09c6-47bc-8ce5-1c512f861997" containerName="keystone-bootstrap" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.549992 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.554092 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.554283 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.554905 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.555024 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.555198 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.555239 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-wh6h4" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.555242 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-777778bb9f-g6hc7"] Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.644607 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-config-data\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.644661 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-fernet-keys\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " 
pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.644732 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-scripts\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.644885 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-combined-ca-bundle\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.644988 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-credential-keys\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.645007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-public-tls-certs\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.645101 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbhp9\" (UniqueName: \"kubernetes.io/projected/8b09380c-07fd-4b37-93bd-c8c44f496ae4-kube-api-access-gbhp9\") pod \"keystone-777778bb9f-g6hc7\" (UID: 
\"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.645229 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-internal-tls-certs\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-config-data\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746668 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-fernet-keys\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-scripts\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-combined-ca-bundle\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 
10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746861 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-credential-keys\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746884 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-public-tls-certs\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbhp9\" (UniqueName: \"kubernetes.io/projected/8b09380c-07fd-4b37-93bd-c8c44f496ae4-kube-api-access-gbhp9\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.746981 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-internal-tls-certs\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.750396 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-credential-keys\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.751351 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-public-tls-certs\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.752243 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-internal-tls-certs\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.753440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-fernet-keys\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.759787 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-config-data\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.768112 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-scripts\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.768141 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b09380c-07fd-4b37-93bd-c8c44f496ae4-combined-ca-bundle\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.772045 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbhp9\" (UniqueName: \"kubernetes.io/projected/8b09380c-07fd-4b37-93bd-c8c44f496ae4-kube-api-access-gbhp9\") pod \"keystone-777778bb9f-g6hc7\" (UID: \"8b09380c-07fd-4b37-93bd-c8c44f496ae4\") " pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:08 crc kubenswrapper[4732]: I1010 08:16:08.894330 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:09 crc kubenswrapper[4732]: I1010 08:16:09.373855 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-777778bb9f-g6hc7"] Oct 10 08:16:09 crc kubenswrapper[4732]: I1010 08:16:09.426621 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-777778bb9f-g6hc7" event={"ID":"8b09380c-07fd-4b37-93bd-c8c44f496ae4","Type":"ContainerStarted","Data":"ea6a062cd2b8d85c3678b6278692435ea1d715a94036d804ef3807b1f1d7cb5c"} Oct 10 08:16:10 crc kubenswrapper[4732]: I1010 08:16:10.436932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-777778bb9f-g6hc7" event={"ID":"8b09380c-07fd-4b37-93bd-c8c44f496ae4","Type":"ContainerStarted","Data":"ded8bce95779e2e55dfd52f4f6d2e2400daa745866d8a7edd982352582f486d5"} Oct 10 08:16:10 crc kubenswrapper[4732]: I1010 08:16:10.437355 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-777778bb9f-g6hc7" Oct 10 08:16:10 crc kubenswrapper[4732]: I1010 08:16:10.465322 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-777778bb9f-g6hc7" podStartSLOduration=2.465301497 podStartE2EDuration="2.465301497s" 
podCreationTimestamp="2025-10-10 08:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:16:10.458621127 +0000 UTC m=+5097.528212408" watchObservedRunningTime="2025-10-10 08:16:10.465301497 +0000 UTC m=+5097.534892738"
Oct 10 08:16:17 crc kubenswrapper[4732]: I1010 08:16:17.660447 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:16:17 crc kubenswrapper[4732]: E1010 08:16:17.661130 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:16:25 crc kubenswrapper[4732]: I1010 08:16:25.368508 4732 scope.go:117] "RemoveContainer" containerID="c661315cbd2cde3f8cd2ee28430c91cb06e63ecd70d90c43366b70c562d1940e"
Oct 10 08:16:29 crc kubenswrapper[4732]: I1010 08:16:29.668520 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:16:29 crc kubenswrapper[4732]: E1010 08:16:29.669984 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:16:40 crc kubenswrapper[4732]: I1010 08:16:40.416744 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-777778bb9f-g6hc7"
Oct 10 08:16:42 crc kubenswrapper[4732]: I1010 08:16:42.661053 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:16:42 crc kubenswrapper[4732]: E1010 08:16:42.662157 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:16:43 crc kubenswrapper[4732]: I1010 08:16:43.970541 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Oct 10 08:16:43 crc kubenswrapper[4732]: I1010 08:16:43.973217 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 10 08:16:43 crc kubenswrapper[4732]: I1010 08:16:43.979346 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Oct 10 08:16:43 crc kubenswrapper[4732]: I1010 08:16:43.979641 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6qbb8"
Oct 10 08:16:43 crc kubenswrapper[4732]: I1010 08:16:43.980057 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Oct 10 08:16:43 crc kubenswrapper[4732]: I1010 08:16:43.992368 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.008619 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.008761 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqgp7\" (UniqueName: \"kubernetes.io/projected/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-kube-api-access-xqgp7\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.008836 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.009112 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.110932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqgp7\" (UniqueName: \"kubernetes.io/projected/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-kube-api-access-xqgp7\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.111007 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.111104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.111237 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.112468 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.117991 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config-secret\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.118417 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.133005 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqgp7\" (UniqueName: \"kubernetes.io/projected/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-kube-api-access-xqgp7\") pod \"openstackclient\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.315343 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Oct 10 08:16:44 crc kubenswrapper[4732]: I1010 08:16:44.817754 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Oct 10 08:16:45 crc kubenswrapper[4732]: I1010 08:16:45.778641 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da789cd8-70ea-4eeb-85ce-b0fc33468b8d","Type":"ContainerStarted","Data":"4e9ffd4aef435575c7d9a557a5371d50a8bacb574e1503a5466aae1176db2dcd"}
Oct 10 08:16:57 crc kubenswrapper[4732]: I1010 08:16:57.660987 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:16:57 crc kubenswrapper[4732]: E1010 08:16:57.662576 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:16:57 crc kubenswrapper[4732]: I1010 08:16:57.885966 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da789cd8-70ea-4eeb-85ce-b0fc33468b8d","Type":"ContainerStarted","Data":"c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d"}
Oct 10 08:17:11 crc kubenswrapper[4732]: I1010 08:17:11.660337 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:17:11 crc kubenswrapper[4732]: E1010 08:17:11.661104 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:17:24 crc kubenswrapper[4732]: I1010 08:17:24.660987 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:17:24 crc kubenswrapper[4732]: E1010 08:17:24.661784 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:17:25 crc kubenswrapper[4732]: I1010 08:17:25.474163 4732 scope.go:117] "RemoveContainer" containerID="be0d3e89eaf1e00cf40e929f2f77727a5d41dea8f54d52917e4a2026af0efef2"
Oct 10 08:17:25 crc kubenswrapper[4732]: I1010 08:17:25.507480 4732 scope.go:117] "RemoveContainer" containerID="8df65e0cacecef098dd26b10eba67b84a0946664edec8a0d70f2a7dd9037cabc"
Oct 10 08:17:25 crc kubenswrapper[4732]: I1010 08:17:25.548458 4732 scope.go:117] "RemoveContainer" containerID="1d4e2cf8adad45949ecbfcd79d98d5f00081a858bed077f5351890df31b71549"
Oct 10 08:17:25 crc kubenswrapper[4732]: I1010 08:17:25.594628 4732 scope.go:117] "RemoveContainer" containerID="2a3b22601a53e91620ec3a3d0fc638a17592e7470d620efd1f62f3cc628814aa"
Oct 10 08:17:25 crc kubenswrapper[4732]: I1010 08:17:25.650401 4732 scope.go:117] "RemoveContainer" containerID="33037f3859291d7fa36d46805b0d806e2d0a7d5a3b74bed1d4e9f3c6d6d27c41"
Oct 10 08:17:37 crc kubenswrapper[4732]: I1010 08:17:37.660965 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:17:37 crc kubenswrapper[4732]: E1010 08:17:37.661644 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:17:49 crc kubenswrapper[4732]: I1010 08:17:49.660220 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:17:49 crc kubenswrapper[4732]: E1010 08:17:49.661160 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:18:01 crc kubenswrapper[4732]: I1010 08:18:01.660901 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:18:01 crc kubenswrapper[4732]: E1010 08:18:01.661988 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:18:15 crc kubenswrapper[4732]: I1010 08:18:15.661004 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:18:15 crc kubenswrapper[4732]: E1010 08:18:15.662378 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 08:18:16 crc kubenswrapper[4732]: I1010 08:18:16.692259 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=81.80499858 podStartE2EDuration="1m33.692237328s" podCreationTimestamp="2025-10-10 08:16:43 +0000 UTC" firstStartedPulling="2025-10-10 08:16:44.829256096 +0000 UTC m=+5131.898847347" lastFinishedPulling="2025-10-10 08:16:56.716494844 +0000 UTC m=+5143.786086095" observedRunningTime="2025-10-10 08:16:57.910607169 +0000 UTC m=+5144.980198430" watchObservedRunningTime="2025-10-10 08:18:16.692237328 +0000 UTC m=+5223.761828569"
Oct 10 08:18:16 crc kubenswrapper[4732]: I1010 08:18:16.699387 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cnq2d"]
Oct 10 08:18:16 crc kubenswrapper[4732]: I1010 08:18:16.700411 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnq2d"
Oct 10 08:18:16 crc kubenswrapper[4732]: I1010 08:18:16.717023 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cnq2d"]
Oct 10 08:18:16 crc kubenswrapper[4732]: I1010 08:18:16.720821 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btvtv\" (UniqueName: \"kubernetes.io/projected/a4475e71-36e3-4a0e-a144-a511d65d1cc3-kube-api-access-btvtv\") pod \"barbican-db-create-cnq2d\" (UID: \"a4475e71-36e3-4a0e-a144-a511d65d1cc3\") " pod="openstack/barbican-db-create-cnq2d"
Oct 10 08:18:16 crc kubenswrapper[4732]: I1010 08:18:16.823224 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btvtv\" (UniqueName: \"kubernetes.io/projected/a4475e71-36e3-4a0e-a144-a511d65d1cc3-kube-api-access-btvtv\") pod \"barbican-db-create-cnq2d\" (UID: \"a4475e71-36e3-4a0e-a144-a511d65d1cc3\") " pod="openstack/barbican-db-create-cnq2d"
Oct 10 08:18:16 crc kubenswrapper[4732]: I1010 08:18:16.861034 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btvtv\" (UniqueName: \"kubernetes.io/projected/a4475e71-36e3-4a0e-a144-a511d65d1cc3-kube-api-access-btvtv\") pod \"barbican-db-create-cnq2d\" (UID: \"a4475e71-36e3-4a0e-a144-a511d65d1cc3\") " pod="openstack/barbican-db-create-cnq2d"
Oct 10 08:18:17 crc kubenswrapper[4732]: I1010 08:18:17.022130 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnq2d"
Oct 10 08:18:17 crc kubenswrapper[4732]: I1010 08:18:17.511988 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cnq2d"]
Oct 10 08:18:17 crc kubenswrapper[4732]: I1010 08:18:17.682333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnq2d" event={"ID":"a4475e71-36e3-4a0e-a144-a511d65d1cc3","Type":"ContainerStarted","Data":"d4011bdfc9aa029233e0b51781d9ec60b05c6fdd7b0f69cc216f0bd387211354"}
Oct 10 08:18:18 crc kubenswrapper[4732]: I1010 08:18:18.674597 4732 generic.go:334] "Generic (PLEG): container finished" podID="a4475e71-36e3-4a0e-a144-a511d65d1cc3" containerID="5792f40aa29781e9a05d2f0f406ead89f46122612db3918775f9d6e56dc300ae" exitCode=0
Oct 10 08:18:18 crc kubenswrapper[4732]: I1010 08:18:18.674719 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnq2d" event={"ID":"a4475e71-36e3-4a0e-a144-a511d65d1cc3","Type":"ContainerDied","Data":"5792f40aa29781e9a05d2f0f406ead89f46122612db3918775f9d6e56dc300ae"}
Oct 10 08:18:20 crc kubenswrapper[4732]: I1010 08:18:20.006344 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnq2d"
Oct 10 08:18:20 crc kubenswrapper[4732]: I1010 08:18:20.084870 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btvtv\" (UniqueName: \"kubernetes.io/projected/a4475e71-36e3-4a0e-a144-a511d65d1cc3-kube-api-access-btvtv\") pod \"a4475e71-36e3-4a0e-a144-a511d65d1cc3\" (UID: \"a4475e71-36e3-4a0e-a144-a511d65d1cc3\") "
Oct 10 08:18:20 crc kubenswrapper[4732]: I1010 08:18:20.093049 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4475e71-36e3-4a0e-a144-a511d65d1cc3-kube-api-access-btvtv" (OuterVolumeSpecName: "kube-api-access-btvtv") pod "a4475e71-36e3-4a0e-a144-a511d65d1cc3" (UID: "a4475e71-36e3-4a0e-a144-a511d65d1cc3"). InnerVolumeSpecName "kube-api-access-btvtv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:18:20 crc kubenswrapper[4732]: I1010 08:18:20.189183 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btvtv\" (UniqueName: \"kubernetes.io/projected/a4475e71-36e3-4a0e-a144-a511d65d1cc3-kube-api-access-btvtv\") on node \"crc\" DevicePath \"\""
Oct 10 08:18:20 crc kubenswrapper[4732]: I1010 08:18:20.698025 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cnq2d" event={"ID":"a4475e71-36e3-4a0e-a144-a511d65d1cc3","Type":"ContainerDied","Data":"d4011bdfc9aa029233e0b51781d9ec60b05c6fdd7b0f69cc216f0bd387211354"}
Oct 10 08:18:20 crc kubenswrapper[4732]: I1010 08:18:20.698062 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4011bdfc9aa029233e0b51781d9ec60b05c6fdd7b0f69cc216f0bd387211354"
Oct 10 08:18:20 crc kubenswrapper[4732]: I1010 08:18:20.698121 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cnq2d"
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.660207 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.714158 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-84b1-account-create-llrv9"]
Oct 10 08:18:26 crc kubenswrapper[4732]: E1010 08:18:26.714797 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4475e71-36e3-4a0e-a144-a511d65d1cc3" containerName="mariadb-database-create"
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.714815 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4475e71-36e3-4a0e-a144-a511d65d1cc3" containerName="mariadb-database-create"
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.715074 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4475e71-36e3-4a0e-a144-a511d65d1cc3" containerName="mariadb-database-create"
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.715970 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-84b1-account-create-llrv9"
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.718643 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.746159 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-84b1-account-create-llrv9"]
Oct 10 08:18:26 crc kubenswrapper[4732]: I1010 08:18:26.904002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv86b\" (UniqueName: \"kubernetes.io/projected/391767a0-caac-4153-a01e-34aaebc47b86-kube-api-access-bv86b\") pod \"barbican-84b1-account-create-llrv9\" (UID: \"391767a0-caac-4153-a01e-34aaebc47b86\") " pod="openstack/barbican-84b1-account-create-llrv9"
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.005621 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv86b\" (UniqueName: \"kubernetes.io/projected/391767a0-caac-4153-a01e-34aaebc47b86-kube-api-access-bv86b\") pod \"barbican-84b1-account-create-llrv9\" (UID: \"391767a0-caac-4153-a01e-34aaebc47b86\") " pod="openstack/barbican-84b1-account-create-llrv9"
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.032452 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv86b\" (UniqueName: \"kubernetes.io/projected/391767a0-caac-4153-a01e-34aaebc47b86-kube-api-access-bv86b\") pod \"barbican-84b1-account-create-llrv9\" (UID: \"391767a0-caac-4153-a01e-34aaebc47b86\") " pod="openstack/barbican-84b1-account-create-llrv9"
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.091457 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-84b1-account-create-llrv9"
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.544115 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-84b1-account-create-llrv9"]
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.768176 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-84b1-account-create-llrv9" event={"ID":"391767a0-caac-4153-a01e-34aaebc47b86","Type":"ContainerStarted","Data":"c6dd858f58c9642fc5947e750c6d8fc01af9df9d9cd71c143ce003e66d641b17"}
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.768521 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-84b1-account-create-llrv9" event={"ID":"391767a0-caac-4153-a01e-34aaebc47b86","Type":"ContainerStarted","Data":"58c0e0e4adf658ad4a047cd9ff195568070ca5cf7ce285e98e68ddd8c094dc50"}
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.770808 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"7df48300028f267e40178e485796865eb5f10b524ed9fcd0a9aaeef67e08b38f"}
Oct 10 08:18:27 crc kubenswrapper[4732]: I1010 08:18:27.786062 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-84b1-account-create-llrv9" podStartSLOduration=1.786045543 podStartE2EDuration="1.786045543s" podCreationTimestamp="2025-10-10 08:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:18:27.78110075 +0000 UTC m=+5234.850692001" watchObservedRunningTime="2025-10-10 08:18:27.786045543 +0000 UTC m=+5234.855636774"
Oct 10 08:18:28 crc kubenswrapper[4732]: I1010 08:18:28.788890 4732 generic.go:334] "Generic (PLEG): container finished" podID="391767a0-caac-4153-a01e-34aaebc47b86" containerID="c6dd858f58c9642fc5947e750c6d8fc01af9df9d9cd71c143ce003e66d641b17" exitCode=0
Oct 10 08:18:28 crc kubenswrapper[4732]: I1010 08:18:28.788982 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-84b1-account-create-llrv9" event={"ID":"391767a0-caac-4153-a01e-34aaebc47b86","Type":"ContainerDied","Data":"c6dd858f58c9642fc5947e750c6d8fc01af9df9d9cd71c143ce003e66d641b17"}
Oct 10 08:18:30 crc kubenswrapper[4732]: I1010 08:18:30.181432 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-84b1-account-create-llrv9"
Oct 10 08:18:30 crc kubenswrapper[4732]: I1010 08:18:30.364327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv86b\" (UniqueName: \"kubernetes.io/projected/391767a0-caac-4153-a01e-34aaebc47b86-kube-api-access-bv86b\") pod \"391767a0-caac-4153-a01e-34aaebc47b86\" (UID: \"391767a0-caac-4153-a01e-34aaebc47b86\") "
Oct 10 08:18:30 crc kubenswrapper[4732]: I1010 08:18:30.371121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/391767a0-caac-4153-a01e-34aaebc47b86-kube-api-access-bv86b" (OuterVolumeSpecName: "kube-api-access-bv86b") pod "391767a0-caac-4153-a01e-34aaebc47b86" (UID: "391767a0-caac-4153-a01e-34aaebc47b86"). InnerVolumeSpecName "kube-api-access-bv86b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:18:30 crc kubenswrapper[4732]: I1010 08:18:30.466236 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv86b\" (UniqueName: \"kubernetes.io/projected/391767a0-caac-4153-a01e-34aaebc47b86-kube-api-access-bv86b\") on node \"crc\" DevicePath \"\""
Oct 10 08:18:30 crc kubenswrapper[4732]: I1010 08:18:30.811714 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-84b1-account-create-llrv9" event={"ID":"391767a0-caac-4153-a01e-34aaebc47b86","Type":"ContainerDied","Data":"58c0e0e4adf658ad4a047cd9ff195568070ca5cf7ce285e98e68ddd8c094dc50"}
Oct 10 08:18:30 crc kubenswrapper[4732]: I1010 08:18:30.811777 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58c0e0e4adf658ad4a047cd9ff195568070ca5cf7ce285e98e68ddd8c094dc50"
Oct 10 08:18:30 crc kubenswrapper[4732]: I1010 08:18:30.811785 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-84b1-account-create-llrv9"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.016286 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gwn4b"]
Oct 10 08:18:32 crc kubenswrapper[4732]: E1010 08:18:32.017245 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="391767a0-caac-4153-a01e-34aaebc47b86" containerName="mariadb-account-create"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.017269 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="391767a0-caac-4153-a01e-34aaebc47b86" containerName="mariadb-account-create"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.017622 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="391767a0-caac-4153-a01e-34aaebc47b86" containerName="mariadb-account-create"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.022817 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.027800 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.027927 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qcmpm"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.038684 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gwn4b"]
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.092263 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7zzs\" (UniqueName: \"kubernetes.io/projected/e539916a-97c8-4b30-9422-1b96bb610b3b-kube-api-access-h7zzs\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.092514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-combined-ca-bundle\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.092773 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-db-sync-config-data\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.193472 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7zzs\" (UniqueName: \"kubernetes.io/projected/e539916a-97c8-4b30-9422-1b96bb610b3b-kube-api-access-h7zzs\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.193593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-combined-ca-bundle\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.193659 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-db-sync-config-data\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.200633 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-combined-ca-bundle\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.201184 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-db-sync-config-data\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.210764 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7zzs\" (UniqueName: \"kubernetes.io/projected/e539916a-97c8-4b30-9422-1b96bb610b3b-kube-api-access-h7zzs\") pod \"barbican-db-sync-gwn4b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") " pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.358983 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.816948 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gwn4b"]
Oct 10 08:18:32 crc kubenswrapper[4732]: I1010 08:18:32.829408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwn4b" event={"ID":"e539916a-97c8-4b30-9422-1b96bb610b3b","Type":"ContainerStarted","Data":"d8ecadc7b7961237bafd7cf331e762d5948e8c86d2d9290d04a30d52f4aed428"}
Oct 10 08:18:37 crc kubenswrapper[4732]: I1010 08:18:37.876025 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwn4b" event={"ID":"e539916a-97c8-4b30-9422-1b96bb610b3b","Type":"ContainerStarted","Data":"0532bb3ac7d72c03abf990dcab0bab9cf8006d75f64b010a4ca5b97f3457e0fa"}
Oct 10 08:18:37 crc kubenswrapper[4732]: I1010 08:18:37.913737 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gwn4b" podStartSLOduration=2.982567184 podStartE2EDuration="6.913708234s" podCreationTimestamp="2025-10-10 08:18:31 +0000 UTC" firstStartedPulling="2025-10-10 08:18:32.822827979 +0000 UTC m=+5239.892419220" lastFinishedPulling="2025-10-10 08:18:36.753969029 +0000 UTC m=+5243.823560270" observedRunningTime="2025-10-10 08:18:37.899159511 +0000 UTC m=+5244.968750812" watchObservedRunningTime="2025-10-10 08:18:37.913708234 +0000 UTC m=+5244.983299505"
Oct 10 08:18:38 crc kubenswrapper[4732]: I1010 08:18:38.889140 4732 generic.go:334] "Generic (PLEG): container finished" podID="e539916a-97c8-4b30-9422-1b96bb610b3b" containerID="0532bb3ac7d72c03abf990dcab0bab9cf8006d75f64b010a4ca5b97f3457e0fa" exitCode=0
Oct 10 08:18:38 crc kubenswrapper[4732]: I1010 08:18:38.889234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwn4b" event={"ID":"e539916a-97c8-4b30-9422-1b96bb610b3b","Type":"ContainerDied","Data":"0532bb3ac7d72c03abf990dcab0bab9cf8006d75f64b010a4ca5b97f3457e0fa"}
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.222590 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.345803 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7zzs\" (UniqueName: \"kubernetes.io/projected/e539916a-97c8-4b30-9422-1b96bb610b3b-kube-api-access-h7zzs\") pod \"e539916a-97c8-4b30-9422-1b96bb610b3b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") "
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.345985 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-db-sync-config-data\") pod \"e539916a-97c8-4b30-9422-1b96bb610b3b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") "
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.346024 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-combined-ca-bundle\") pod \"e539916a-97c8-4b30-9422-1b96bb610b3b\" (UID: \"e539916a-97c8-4b30-9422-1b96bb610b3b\") "
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.352754 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e539916a-97c8-4b30-9422-1b96bb610b3b" (UID: "e539916a-97c8-4b30-9422-1b96bb610b3b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.353434 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e539916a-97c8-4b30-9422-1b96bb610b3b-kube-api-access-h7zzs" (OuterVolumeSpecName: "kube-api-access-h7zzs") pod "e539916a-97c8-4b30-9422-1b96bb610b3b" (UID: "e539916a-97c8-4b30-9422-1b96bb610b3b"). InnerVolumeSpecName "kube-api-access-h7zzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.375669 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e539916a-97c8-4b30-9422-1b96bb610b3b" (UID: "e539916a-97c8-4b30-9422-1b96bb610b3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.448203 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7zzs\" (UniqueName: \"kubernetes.io/projected/e539916a-97c8-4b30-9422-1b96bb610b3b-kube-api-access-h7zzs\") on node \"crc\" DevicePath \"\""
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.448252 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.448265 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e539916a-97c8-4b30-9422-1b96bb610b3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.905471 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gwn4b" event={"ID":"e539916a-97c8-4b30-9422-1b96bb610b3b","Type":"ContainerDied","Data":"d8ecadc7b7961237bafd7cf331e762d5948e8c86d2d9290d04a30d52f4aed428"}
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.905520 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ecadc7b7961237bafd7cf331e762d5948e8c86d2d9290d04a30d52f4aed428"
Oct 10 08:18:40 crc kubenswrapper[4732]: I1010 08:18:40.905581 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gwn4b"
Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.182960 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-67c968cc46-jcgx7"]
Oct 10 08:18:41 crc kubenswrapper[4732]: E1010 08:18:41.183340 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e539916a-97c8-4b30-9422-1b96bb610b3b" containerName="barbican-db-sync"
Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.183360 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e539916a-97c8-4b30-9422-1b96bb610b3b" containerName="barbican-db-sync"
Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.183595 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e539916a-97c8-4b30-9422-1b96bb610b3b" containerName="barbican-db-sync"
Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.184745 4732 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.190596 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.194473 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qcmpm" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.194490 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.208416 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67c968cc46-jcgx7"] Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.247669 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6765975-zgmjf"] Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.249408 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.252255 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.261867 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-combined-ca-bundle\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.261925 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-logs\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.261972 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfsg\" (UniqueName: \"kubernetes.io/projected/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-kube-api-access-2jfsg\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.262017 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-config-data\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 
08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.262054 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-config-data-custom\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.262244 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6765975-zgmjf"] Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363457 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-logs\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363513 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-config-data-custom\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363537 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-combined-ca-bundle\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363615 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zr5vj\" (UniqueName: \"kubernetes.io/projected/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-kube-api-access-zr5vj\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363763 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-combined-ca-bundle\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363828 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-config-data\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363874 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-logs\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.363955 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jfsg\" (UniqueName: \"kubernetes.io/projected/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-kube-api-access-2jfsg\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.364033 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-config-data\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.364062 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-config-data-custom\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.365506 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-logs\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.369375 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6888c7f469-lf5dv"] Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.371228 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.372903 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-config-data\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.381506 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-combined-ca-bundle\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.390432 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6888c7f469-lf5dv"] Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.391259 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-config-data-custom\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.393402 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jfsg\" (UniqueName: \"kubernetes.io/projected/7af9c216-845f-4a2a-b87c-28efa0bb0b8e-kube-api-access-2jfsg\") pod \"barbican-keystone-listener-67c968cc46-jcgx7\" (UID: \"7af9c216-845f-4a2a-b87c-28efa0bb0b8e\") " pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.444993 4732 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/barbican-api-759fb46498-lbxxr"] Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-logs\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465578 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-config-data-custom\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465603 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-combined-ca-bundle\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465630 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-dns-svc\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfh7\" (UniqueName: \"kubernetes.io/projected/113b111b-bca6-49ea-9c64-7a903abecef8-kube-api-access-2lfh7\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " 
pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-sb\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465710 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-config\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465734 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5vj\" (UniqueName: \"kubernetes.io/projected/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-kube-api-access-zr5vj\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465758 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-config-data\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465826 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-nb\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " 
pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.465945 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-logs\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.467468 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759fb46498-lbxxr"] Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.467595 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.470135 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.472146 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-config-data\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.472219 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-combined-ca-bundle\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.476878 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-config-data-custom\") pod \"barbican-worker-6765975-zgmjf\" (UID: 
\"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.491581 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5vj\" (UniqueName: \"kubernetes.io/projected/a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0-kube-api-access-zr5vj\") pod \"barbican-worker-6765975-zgmjf\" (UID: \"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0\") " pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.513044 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.567978 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-dns-svc\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfh7\" (UniqueName: \"kubernetes.io/projected/113b111b-bca6-49ea-9c64-7a903abecef8-kube-api-access-2lfh7\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568073 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-sb\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-combined-ca-bundle\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-config\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data-custom\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568216 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb764f1-80a2-484e-98b9-131360b0cabc-logs\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568298 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-nb\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568345 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.568392 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpjnd\" (UniqueName: \"kubernetes.io/projected/aeb764f1-80a2-484e-98b9-131360b0cabc-kube-api-access-dpjnd\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.569583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-dns-svc\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.571011 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-config\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.575359 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-nb\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.575561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-sb\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.580289 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6765975-zgmjf" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.590859 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfh7\" (UniqueName: \"kubernetes.io/projected/113b111b-bca6-49ea-9c64-7a903abecef8-kube-api-access-2lfh7\") pod \"dnsmasq-dns-6888c7f469-lf5dv\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") " pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.613888 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.669815 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb764f1-80a2-484e-98b9-131360b0cabc-logs\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.670054 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.670094 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpjnd\" (UniqueName: 
\"kubernetes.io/projected/aeb764f1-80a2-484e-98b9-131360b0cabc-kube-api-access-dpjnd\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.670135 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-combined-ca-bundle\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.670157 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data-custom\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.672659 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb764f1-80a2-484e-98b9-131360b0cabc-logs\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.681673 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.681887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-combined-ca-bundle\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.682921 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data-custom\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.685197 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpjnd\" (UniqueName: \"kubernetes.io/projected/aeb764f1-80a2-484e-98b9-131360b0cabc-kube-api-access-dpjnd\") pod \"barbican-api-759fb46498-lbxxr\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.935215 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:41 crc kubenswrapper[4732]: I1010 08:18:41.986713 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67c968cc46-jcgx7"] Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.172570 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6765975-zgmjf"] Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.183648 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6888c7f469-lf5dv"] Oct 10 08:18:42 crc kubenswrapper[4732]: W1010 08:18:42.186750 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod113b111b_bca6_49ea_9c64_7a903abecef8.slice/crio-9cc47f48f0909ffce09e6e90eb19057cadb8d3e5dd7f0bb8e2be4f00d79d9da7 WatchSource:0}: Error finding container 9cc47f48f0909ffce09e6e90eb19057cadb8d3e5dd7f0bb8e2be4f00d79d9da7: Status 404 returned error can't find the container with id 9cc47f48f0909ffce09e6e90eb19057cadb8d3e5dd7f0bb8e2be4f00d79d9da7 Oct 10 08:18:42 crc kubenswrapper[4732]: W1010 08:18:42.192036 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3c08da0_8c8b_4d8a_893d_b3d77a2acdc0.slice/crio-4535750eda0b3941aba229b7d8887b8bd0a7e54dba6931ff079887aa48bd3648 WatchSource:0}: Error finding container 4535750eda0b3941aba229b7d8887b8bd0a7e54dba6931ff079887aa48bd3648: Status 404 returned error can't find the container with id 4535750eda0b3941aba229b7d8887b8bd0a7e54dba6931ff079887aa48bd3648 Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.458739 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-759fb46498-lbxxr"] Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.922415 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" event={"ID":"7af9c216-845f-4a2a-b87c-28efa0bb0b8e","Type":"ContainerStarted","Data":"a52d0fc284fce5b65e89fbb4fb306e8d9129459818b5bbdcbac3148b3bed4332"} Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.924742 4732 generic.go:334] "Generic (PLEG): container finished" podID="113b111b-bca6-49ea-9c64-7a903abecef8" containerID="1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b" exitCode=0 Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.925005 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" event={"ID":"113b111b-bca6-49ea-9c64-7a903abecef8","Type":"ContainerDied","Data":"1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b"} Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.925064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" event={"ID":"113b111b-bca6-49ea-9c64-7a903abecef8","Type":"ContainerStarted","Data":"9cc47f48f0909ffce09e6e90eb19057cadb8d3e5dd7f0bb8e2be4f00d79d9da7"} Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.929722 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6765975-zgmjf" event={"ID":"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0","Type":"ContainerStarted","Data":"4535750eda0b3941aba229b7d8887b8bd0a7e54dba6931ff079887aa48bd3648"} Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.937419 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759fb46498-lbxxr" event={"ID":"aeb764f1-80a2-484e-98b9-131360b0cabc","Type":"ContainerStarted","Data":"886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7"} Oct 10 08:18:42 crc kubenswrapper[4732]: I1010 08:18:42.937498 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759fb46498-lbxxr" 
event={"ID":"aeb764f1-80a2-484e-98b9-131360b0cabc","Type":"ContainerStarted","Data":"c49064a804630891cbad6cc45e2bdd8122fc44504e4e03e28ec8007a53a5dae8"} Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.809617 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5cc7b7c678-fwj45"] Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.811392 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.813050 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.813055 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.831051 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cc7b7c678-fwj45"] Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.912537 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-logs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.912835 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-config-data-custom\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.912864 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-public-tls-certs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.912917 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sgh2\" (UniqueName: \"kubernetes.io/projected/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-kube-api-access-6sgh2\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.912939 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-internal-tls-certs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.912971 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-config-data\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.913043 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-combined-ca-bundle\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.954397 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-759fb46498-lbxxr" event={"ID":"aeb764f1-80a2-484e-98b9-131360b0cabc","Type":"ContainerStarted","Data":"595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50"} Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.955612 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.955648 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:43 crc kubenswrapper[4732]: I1010 08:18:43.985892 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-759fb46498-lbxxr" podStartSLOduration=2.985873501 podStartE2EDuration="2.985873501s" podCreationTimestamp="2025-10-10 08:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:18:43.971561045 +0000 UTC m=+5251.041152306" watchObservedRunningTime="2025-10-10 08:18:43.985873501 +0000 UTC m=+5251.055464742" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.020187 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-config-data-custom\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.020272 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-public-tls-certs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.020337 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-internal-tls-certs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.020359 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sgh2\" (UniqueName: \"kubernetes.io/projected/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-kube-api-access-6sgh2\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.020413 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-config-data\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.020465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-combined-ca-bundle\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.020565 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-logs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.021100 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-logs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.027357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-combined-ca-bundle\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.027666 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-public-tls-certs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.028446 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-internal-tls-certs\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.033787 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-config-data\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.038087 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-config-data-custom\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.040768 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sgh2\" (UniqueName: \"kubernetes.io/projected/e0ebdd0f-4d64-4973-bd61-1982ae84e68f-kube-api-access-6sgh2\") pod \"barbican-api-5cc7b7c678-fwj45\" (UID: \"e0ebdd0f-4d64-4973-bd61-1982ae84e68f\") " pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.130975 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.581496 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5cc7b7c678-fwj45"] Oct 10 08:18:44 crc kubenswrapper[4732]: W1010 08:18:44.585735 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0ebdd0f_4d64_4973_bd61_1982ae84e68f.slice/crio-9978f232dc42b74faf83b361c43f29fbd5654bd4dd976a10e09f7968d106c238 WatchSource:0}: Error finding container 9978f232dc42b74faf83b361c43f29fbd5654bd4dd976a10e09f7968d106c238: Status 404 returned error can't find the container with id 9978f232dc42b74faf83b361c43f29fbd5654bd4dd976a10e09f7968d106c238 Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.968931 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" event={"ID":"7af9c216-845f-4a2a-b87c-28efa0bb0b8e","Type":"ContainerStarted","Data":"98e50e2b8074e48aa81d5ed9fea775d0bcff86d4280c3234b087102857a2f17e"} Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.969448 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" event={"ID":"7af9c216-845f-4a2a-b87c-28efa0bb0b8e","Type":"ContainerStarted","Data":"8c80e6b40a746039d512a4606d9fdd81ac5f2a9a8c306d49062fbc5511dbc611"} Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.970798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cc7b7c678-fwj45" event={"ID":"e0ebdd0f-4d64-4973-bd61-1982ae84e68f","Type":"ContainerStarted","Data":"57810afdd606cad300ba70709b34d5a28b490fb4dea27124bfd99e96488e73a7"} Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.970832 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cc7b7c678-fwj45" event={"ID":"e0ebdd0f-4d64-4973-bd61-1982ae84e68f","Type":"ContainerStarted","Data":"9978f232dc42b74faf83b361c43f29fbd5654bd4dd976a10e09f7968d106c238"} Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.973512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" event={"ID":"113b111b-bca6-49ea-9c64-7a903abecef8","Type":"ContainerStarted","Data":"01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029"} Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.973998 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.977720 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6765975-zgmjf" event={"ID":"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0","Type":"ContainerStarted","Data":"b562adee7f6abdac07122cab75f73d51e547da605ae1776393a5646ceb894853"} Oct 10 08:18:44 crc kubenswrapper[4732]: I1010 08:18:44.978647 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6765975-zgmjf" event={"ID":"a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0","Type":"ContainerStarted","Data":"7394a35847d19fcdd75bdfa7800069954b47958d4658a302ca080b468ba65380"} Oct 10 08:18:44 crc 
kubenswrapper[4732]: I1010 08:18:44.991628 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-67c968cc46-jcgx7" podStartSLOduration=2.077025813 podStartE2EDuration="3.991610562s" podCreationTimestamp="2025-10-10 08:18:41 +0000 UTC" firstStartedPulling="2025-10-10 08:18:42.001376626 +0000 UTC m=+5249.070967867" lastFinishedPulling="2025-10-10 08:18:43.915961375 +0000 UTC m=+5250.985552616" observedRunningTime="2025-10-10 08:18:44.982458655 +0000 UTC m=+5252.052049896" watchObservedRunningTime="2025-10-10 08:18:44.991610562 +0000 UTC m=+5252.061201803" Oct 10 08:18:45 crc kubenswrapper[4732]: I1010 08:18:45.022179 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6765975-zgmjf" podStartSLOduration=2.300897961 podStartE2EDuration="4.022153896s" podCreationTimestamp="2025-10-10 08:18:41 +0000 UTC" firstStartedPulling="2025-10-10 08:18:42.195602575 +0000 UTC m=+5249.265193816" lastFinishedPulling="2025-10-10 08:18:43.91685851 +0000 UTC m=+5250.986449751" observedRunningTime="2025-10-10 08:18:45.010318557 +0000 UTC m=+5252.079909798" watchObservedRunningTime="2025-10-10 08:18:45.022153896 +0000 UTC m=+5252.091745137" Oct 10 08:18:45 crc kubenswrapper[4732]: I1010 08:18:45.048390 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" podStartSLOduration=4.048369963 podStartE2EDuration="4.048369963s" podCreationTimestamp="2025-10-10 08:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:18:45.043501792 +0000 UTC m=+5252.113093063" watchObservedRunningTime="2025-10-10 08:18:45.048369963 +0000 UTC m=+5252.117961204" Oct 10 08:18:45 crc kubenswrapper[4732]: I1010 08:18:45.987028 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5cc7b7c678-fwj45" 
event={"ID":"e0ebdd0f-4d64-4973-bd61-1982ae84e68f","Type":"ContainerStarted","Data":"29d9902879d6ca3eefbc3bf9a5b4cc27eaa8bb4246afe3b86910b69253885ffe"} Oct 10 08:18:45 crc kubenswrapper[4732]: I1010 08:18:45.987530 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:45 crc kubenswrapper[4732]: I1010 08:18:45.987562 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:46 crc kubenswrapper[4732]: I1010 08:18:46.010659 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5cc7b7c678-fwj45" podStartSLOduration=3.010641752 podStartE2EDuration="3.010641752s" podCreationTimestamp="2025-10-10 08:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:18:46.008930286 +0000 UTC m=+5253.078521537" watchObservedRunningTime="2025-10-10 08:18:46.010641752 +0000 UTC m=+5253.080232993" Oct 10 08:18:50 crc kubenswrapper[4732]: I1010 08:18:50.611207 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:50 crc kubenswrapper[4732]: I1010 08:18:50.612759 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5cc7b7c678-fwj45" Oct 10 08:18:50 crc kubenswrapper[4732]: I1010 08:18:50.749622 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-759fb46498-lbxxr"] Oct 10 08:18:50 crc kubenswrapper[4732]: I1010 08:18:50.750104 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-759fb46498-lbxxr" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api-log" containerID="cri-o://886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7" gracePeriod=30 Oct 10 08:18:50 crc 
kubenswrapper[4732]: I1010 08:18:50.750943 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-759fb46498-lbxxr" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api" containerID="cri-o://595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50" gracePeriod=30 Oct 10 08:18:50 crc kubenswrapper[4732]: I1010 08:18:50.757448 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759fb46498-lbxxr" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.33:9311/healthcheck\": EOF" Oct 10 08:18:50 crc kubenswrapper[4732]: I1010 08:18:50.763042 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759fb46498-lbxxr" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.33:9311/healthcheck\": EOF" Oct 10 08:18:51 crc kubenswrapper[4732]: I1010 08:18:51.035280 4732 generic.go:334] "Generic (PLEG): container finished" podID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerID="886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7" exitCode=143 Oct 10 08:18:51 crc kubenswrapper[4732]: I1010 08:18:51.035363 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759fb46498-lbxxr" event={"ID":"aeb764f1-80a2-484e-98b9-131360b0cabc","Type":"ContainerDied","Data":"886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7"} Oct 10 08:18:51 crc kubenswrapper[4732]: I1010 08:18:51.614940 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" Oct 10 08:18:51 crc kubenswrapper[4732]: I1010 08:18:51.688459 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6597979d97-ftrb6"] Oct 10 08:18:51 crc kubenswrapper[4732]: I1010 08:18:51.688749 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" podUID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerName="dnsmasq-dns" containerID="cri-o://cbd6f412292be551b93a3c5d5a9813f155f2251c5359c4252c4cf5f0804709a6" gracePeriod=10 Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.049535 4732 generic.go:334] "Generic (PLEG): container finished" podID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerID="cbd6f412292be551b93a3c5d5a9813f155f2251c5359c4252c4cf5f0804709a6" exitCode=0 Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.049588 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" event={"ID":"6639fe3d-70b3-4f26-828e-e2946f744bac","Type":"ContainerDied","Data":"cbd6f412292be551b93a3c5d5a9813f155f2251c5359c4252c4cf5f0804709a6"} Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.209495 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.272919 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-nb\") pod \"6639fe3d-70b3-4f26-828e-e2946f744bac\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.273097 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-config\") pod \"6639fe3d-70b3-4f26-828e-e2946f744bac\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.273126 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnhtx\" (UniqueName: 
\"kubernetes.io/projected/6639fe3d-70b3-4f26-828e-e2946f744bac-kube-api-access-rnhtx\") pod \"6639fe3d-70b3-4f26-828e-e2946f744bac\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.273800 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-sb\") pod \"6639fe3d-70b3-4f26-828e-e2946f744bac\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.273867 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-dns-svc\") pod \"6639fe3d-70b3-4f26-828e-e2946f744bac\" (UID: \"6639fe3d-70b3-4f26-828e-e2946f744bac\") " Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.291821 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6639fe3d-70b3-4f26-828e-e2946f744bac-kube-api-access-rnhtx" (OuterVolumeSpecName: "kube-api-access-rnhtx") pod "6639fe3d-70b3-4f26-828e-e2946f744bac" (UID: "6639fe3d-70b3-4f26-828e-e2946f744bac"). InnerVolumeSpecName "kube-api-access-rnhtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.320776 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6639fe3d-70b3-4f26-828e-e2946f744bac" (UID: "6639fe3d-70b3-4f26-828e-e2946f744bac"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.321503 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-config" (OuterVolumeSpecName: "config") pod "6639fe3d-70b3-4f26-828e-e2946f744bac" (UID: "6639fe3d-70b3-4f26-828e-e2946f744bac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.323000 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6639fe3d-70b3-4f26-828e-e2946f744bac" (UID: "6639fe3d-70b3-4f26-828e-e2946f744bac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.343355 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6639fe3d-70b3-4f26-828e-e2946f744bac" (UID: "6639fe3d-70b3-4f26-828e-e2946f744bac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.375526 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnhtx\" (UniqueName: \"kubernetes.io/projected/6639fe3d-70b3-4f26-828e-e2946f744bac-kube-api-access-rnhtx\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.375885 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.375956 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.376016 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:52 crc kubenswrapper[4732]: I1010 08:18:52.376068 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6639fe3d-70b3-4f26-828e-e2946f744bac-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:53 crc kubenswrapper[4732]: I1010 08:18:53.064667 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" event={"ID":"6639fe3d-70b3-4f26-828e-e2946f744bac","Type":"ContainerDied","Data":"e1ecc745affa6e5110acb49eaf94768e6392e4497fb1c4edabad57ae0fd2603e"} Oct 10 08:18:53 crc kubenswrapper[4732]: I1010 08:18:53.064871 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6597979d97-ftrb6" Oct 10 08:18:53 crc kubenswrapper[4732]: I1010 08:18:53.065568 4732 scope.go:117] "RemoveContainer" containerID="cbd6f412292be551b93a3c5d5a9813f155f2251c5359c4252c4cf5f0804709a6" Oct 10 08:18:53 crc kubenswrapper[4732]: I1010 08:18:53.095329 4732 scope.go:117] "RemoveContainer" containerID="130635c705584a6ff742f83511d184d7dbd792213d67b66f9dbc4aa4c2469ce3" Oct 10 08:18:53 crc kubenswrapper[4732]: I1010 08:18:53.120120 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6597979d97-ftrb6"] Oct 10 08:18:53 crc kubenswrapper[4732]: I1010 08:18:53.134666 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6597979d97-ftrb6"] Oct 10 08:18:53 crc kubenswrapper[4732]: I1010 08:18:53.692253 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6639fe3d-70b3-4f26-828e-e2946f744bac" path="/var/lib/kubelet/pods/6639fe3d-70b3-4f26-828e-e2946f744bac/volumes" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.177059 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759fb46498-lbxxr" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.33:9311/healthcheck\": read tcp 10.217.0.2:58622->10.217.1.33:9311: read: connection reset by peer" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.177077 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-759fb46498-lbxxr" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.33:9311/healthcheck\": read tcp 10.217.0.2:58610->10.217.1.33:9311: read: connection reset by peer" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.657325 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.756744 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-combined-ca-bundle\") pod \"aeb764f1-80a2-484e-98b9-131360b0cabc\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.756860 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb764f1-80a2-484e-98b9-131360b0cabc-logs\") pod \"aeb764f1-80a2-484e-98b9-131360b0cabc\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.756961 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data-custom\") pod \"aeb764f1-80a2-484e-98b9-131360b0cabc\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.756997 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpjnd\" (UniqueName: \"kubernetes.io/projected/aeb764f1-80a2-484e-98b9-131360b0cabc-kube-api-access-dpjnd\") pod \"aeb764f1-80a2-484e-98b9-131360b0cabc\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.757359 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeb764f1-80a2-484e-98b9-131360b0cabc-logs" (OuterVolumeSpecName: "logs") pod "aeb764f1-80a2-484e-98b9-131360b0cabc" (UID: "aeb764f1-80a2-484e-98b9-131360b0cabc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.757737 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data\") pod \"aeb764f1-80a2-484e-98b9-131360b0cabc\" (UID: \"aeb764f1-80a2-484e-98b9-131360b0cabc\") " Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.758114 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeb764f1-80a2-484e-98b9-131360b0cabc-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.762243 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "aeb764f1-80a2-484e-98b9-131360b0cabc" (UID: "aeb764f1-80a2-484e-98b9-131360b0cabc"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.762627 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeb764f1-80a2-484e-98b9-131360b0cabc-kube-api-access-dpjnd" (OuterVolumeSpecName: "kube-api-access-dpjnd") pod "aeb764f1-80a2-484e-98b9-131360b0cabc" (UID: "aeb764f1-80a2-484e-98b9-131360b0cabc"). InnerVolumeSpecName "kube-api-access-dpjnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.780711 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeb764f1-80a2-484e-98b9-131360b0cabc" (UID: "aeb764f1-80a2-484e-98b9-131360b0cabc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.805794 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data" (OuterVolumeSpecName: "config-data") pod "aeb764f1-80a2-484e-98b9-131360b0cabc" (UID: "aeb764f1-80a2-484e-98b9-131360b0cabc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.859959 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.859993 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.860003 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpjnd\" (UniqueName: \"kubernetes.io/projected/aeb764f1-80a2-484e-98b9-131360b0cabc-kube-api-access-dpjnd\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:56 crc kubenswrapper[4732]: I1010 08:18:56.860013 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeb764f1-80a2-484e-98b9-131360b0cabc-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.123617 4732 generic.go:334] "Generic (PLEG): container finished" podID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerID="595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50" exitCode=0 Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.123670 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759fb46498-lbxxr" 
event={"ID":"aeb764f1-80a2-484e-98b9-131360b0cabc","Type":"ContainerDied","Data":"595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50"} Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.123777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-759fb46498-lbxxr" event={"ID":"aeb764f1-80a2-484e-98b9-131360b0cabc","Type":"ContainerDied","Data":"c49064a804630891cbad6cc45e2bdd8122fc44504e4e03e28ec8007a53a5dae8"} Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.123806 4732 scope.go:117] "RemoveContainer" containerID="595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.124133 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-759fb46498-lbxxr" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.148000 4732 scope.go:117] "RemoveContainer" containerID="886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.159826 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-759fb46498-lbxxr"] Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.166427 4732 scope.go:117] "RemoveContainer" containerID="595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50" Oct 10 08:18:57 crc kubenswrapper[4732]: E1010 08:18:57.166972 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50\": container with ID starting with 595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50 not found: ID does not exist" containerID="595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.167013 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50"} err="failed to get container status \"595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50\": rpc error: code = NotFound desc = could not find container \"595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50\": container with ID starting with 595c74ad7c23d1a0688253c96456dc2196769a8fbe6c594e8b1ec6233d571f50 not found: ID does not exist" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.167039 4732 scope.go:117] "RemoveContainer" containerID="886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7" Oct 10 08:18:57 crc kubenswrapper[4732]: E1010 08:18:57.167393 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7\": container with ID starting with 886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7 not found: ID does not exist" containerID="886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.167437 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7"} err="failed to get container status \"886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7\": rpc error: code = NotFound desc = could not find container \"886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7\": container with ID starting with 886609a6cb44bd1ad0e5c21a556d05927b17d6468ad5818dfc1e5677647992a7 not found: ID does not exist" Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.169493 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-759fb46498-lbxxr"] Oct 10 08:18:57 crc kubenswrapper[4732]: I1010 08:18:57.675014 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" path="/var/lib/kubelet/pods/aeb764f1-80a2-484e-98b9-131360b0cabc/volumes" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.612550 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zcpf6"] Oct 10 08:19:30 crc kubenswrapper[4732]: E1010 08:19:30.613414 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerName="init" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.613428 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerName="init" Oct 10 08:19:30 crc kubenswrapper[4732]: E1010 08:19:30.613444 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api-log" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.613451 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api-log" Oct 10 08:19:30 crc kubenswrapper[4732]: E1010 08:19:30.613461 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.613468 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api" Oct 10 08:19:30 crc kubenswrapper[4732]: E1010 08:19:30.613486 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerName="dnsmasq-dns" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.613493 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerName="dnsmasq-dns" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.613715 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6639fe3d-70b3-4f26-828e-e2946f744bac" containerName="dnsmasq-dns" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.613734 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.613756 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeb764f1-80a2-484e-98b9-131360b0cabc" containerName="barbican-api-log" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.614376 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcpf6" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.621809 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zcpf6"] Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.728745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2s66\" (UniqueName: \"kubernetes.io/projected/3eb708b8-d5fa-425a-b1b2-919e632bb7b8-kube-api-access-w2s66\") pod \"neutron-db-create-zcpf6\" (UID: \"3eb708b8-d5fa-425a-b1b2-919e632bb7b8\") " pod="openstack/neutron-db-create-zcpf6" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.831423 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2s66\" (UniqueName: \"kubernetes.io/projected/3eb708b8-d5fa-425a-b1b2-919e632bb7b8-kube-api-access-w2s66\") pod \"neutron-db-create-zcpf6\" (UID: \"3eb708b8-d5fa-425a-b1b2-919e632bb7b8\") " pod="openstack/neutron-db-create-zcpf6" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.864426 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2s66\" (UniqueName: \"kubernetes.io/projected/3eb708b8-d5fa-425a-b1b2-919e632bb7b8-kube-api-access-w2s66\") pod \"neutron-db-create-zcpf6\" (UID: \"3eb708b8-d5fa-425a-b1b2-919e632bb7b8\") " 
pod="openstack/neutron-db-create-zcpf6" Oct 10 08:19:30 crc kubenswrapper[4732]: I1010 08:19:30.936160 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcpf6" Oct 10 08:19:31 crc kubenswrapper[4732]: I1010 08:19:31.255164 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zcpf6"] Oct 10 08:19:31 crc kubenswrapper[4732]: I1010 08:19:31.484913 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcpf6" event={"ID":"3eb708b8-d5fa-425a-b1b2-919e632bb7b8","Type":"ContainerStarted","Data":"050d265c2e176646e010776b89fb4b86e150efa09692e4d759afefbf99b908d9"} Oct 10 08:19:31 crc kubenswrapper[4732]: I1010 08:19:31.484966 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcpf6" event={"ID":"3eb708b8-d5fa-425a-b1b2-919e632bb7b8","Type":"ContainerStarted","Data":"7bcec1d720d1923247c3a0776ffede5848e4c0908eb328eb230f91b20137bd09"} Oct 10 08:19:31 crc kubenswrapper[4732]: I1010 08:19:31.502050 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-zcpf6" podStartSLOduration=1.50203094 podStartE2EDuration="1.50203094s" podCreationTimestamp="2025-10-10 08:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:19:31.500152249 +0000 UTC m=+5298.569743490" watchObservedRunningTime="2025-10-10 08:19:31.50203094 +0000 UTC m=+5298.571622191" Oct 10 08:19:32 crc kubenswrapper[4732]: I1010 08:19:32.499435 4732 generic.go:334] "Generic (PLEG): container finished" podID="3eb708b8-d5fa-425a-b1b2-919e632bb7b8" containerID="050d265c2e176646e010776b89fb4b86e150efa09692e4d759afefbf99b908d9" exitCode=0 Oct 10 08:19:32 crc kubenswrapper[4732]: I1010 08:19:32.499551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcpf6" 
event={"ID":"3eb708b8-d5fa-425a-b1b2-919e632bb7b8","Type":"ContainerDied","Data":"050d265c2e176646e010776b89fb4b86e150efa09692e4d759afefbf99b908d9"} Oct 10 08:19:33 crc kubenswrapper[4732]: I1010 08:19:33.876286 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcpf6" Oct 10 08:19:34 crc kubenswrapper[4732]: I1010 08:19:34.002451 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2s66\" (UniqueName: \"kubernetes.io/projected/3eb708b8-d5fa-425a-b1b2-919e632bb7b8-kube-api-access-w2s66\") pod \"3eb708b8-d5fa-425a-b1b2-919e632bb7b8\" (UID: \"3eb708b8-d5fa-425a-b1b2-919e632bb7b8\") " Oct 10 08:19:34 crc kubenswrapper[4732]: I1010 08:19:34.013369 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb708b8-d5fa-425a-b1b2-919e632bb7b8-kube-api-access-w2s66" (OuterVolumeSpecName: "kube-api-access-w2s66") pod "3eb708b8-d5fa-425a-b1b2-919e632bb7b8" (UID: "3eb708b8-d5fa-425a-b1b2-919e632bb7b8"). InnerVolumeSpecName "kube-api-access-w2s66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:19:34 crc kubenswrapper[4732]: I1010 08:19:34.104214 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2s66\" (UniqueName: \"kubernetes.io/projected/3eb708b8-d5fa-425a-b1b2-919e632bb7b8-kube-api-access-w2s66\") on node \"crc\" DevicePath \"\"" Oct 10 08:19:34 crc kubenswrapper[4732]: I1010 08:19:34.524855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zcpf6" event={"ID":"3eb708b8-d5fa-425a-b1b2-919e632bb7b8","Type":"ContainerDied","Data":"7bcec1d720d1923247c3a0776ffede5848e4c0908eb328eb230f91b20137bd09"} Oct 10 08:19:34 crc kubenswrapper[4732]: I1010 08:19:34.524913 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bcec1d720d1923247c3a0776ffede5848e4c0908eb328eb230f91b20137bd09" Oct 10 08:19:34 crc kubenswrapper[4732]: I1010 08:19:34.525343 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zcpf6" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.756566 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8ec8-account-create-46pz2"] Oct 10 08:19:40 crc kubenswrapper[4732]: E1010 08:19:40.757443 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb708b8-d5fa-425a-b1b2-919e632bb7b8" containerName="mariadb-database-create" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.757457 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb708b8-d5fa-425a-b1b2-919e632bb7b8" containerName="mariadb-database-create" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.757668 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb708b8-d5fa-425a-b1b2-919e632bb7b8" containerName="mariadb-database-create" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.758279 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8ec8-account-create-46pz2" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.760881 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.771495 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8ec8-account-create-46pz2"] Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.830664 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x56cq\" (UniqueName: \"kubernetes.io/projected/add98675-a136-495e-ac8d-45049b57e51b-kube-api-access-x56cq\") pod \"neutron-8ec8-account-create-46pz2\" (UID: \"add98675-a136-495e-ac8d-45049b57e51b\") " pod="openstack/neutron-8ec8-account-create-46pz2" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.931828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x56cq\" (UniqueName: \"kubernetes.io/projected/add98675-a136-495e-ac8d-45049b57e51b-kube-api-access-x56cq\") pod \"neutron-8ec8-account-create-46pz2\" (UID: \"add98675-a136-495e-ac8d-45049b57e51b\") " pod="openstack/neutron-8ec8-account-create-46pz2" Oct 10 08:19:40 crc kubenswrapper[4732]: I1010 08:19:40.950938 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x56cq\" (UniqueName: \"kubernetes.io/projected/add98675-a136-495e-ac8d-45049b57e51b-kube-api-access-x56cq\") pod \"neutron-8ec8-account-create-46pz2\" (UID: \"add98675-a136-495e-ac8d-45049b57e51b\") " pod="openstack/neutron-8ec8-account-create-46pz2" Oct 10 08:19:41 crc kubenswrapper[4732]: I1010 08:19:41.079217 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8ec8-account-create-46pz2" Oct 10 08:19:41 crc kubenswrapper[4732]: W1010 08:19:41.531828 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadd98675_a136_495e_ac8d_45049b57e51b.slice/crio-dc5a66eefd014c0580d990cd060601e381de36daa47200d606996b0df5693686 WatchSource:0}: Error finding container dc5a66eefd014c0580d990cd060601e381de36daa47200d606996b0df5693686: Status 404 returned error can't find the container with id dc5a66eefd014c0580d990cd060601e381de36daa47200d606996b0df5693686 Oct 10 08:19:41 crc kubenswrapper[4732]: I1010 08:19:41.543657 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8ec8-account-create-46pz2"] Oct 10 08:19:41 crc kubenswrapper[4732]: I1010 08:19:41.606943 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8ec8-account-create-46pz2" event={"ID":"add98675-a136-495e-ac8d-45049b57e51b","Type":"ContainerStarted","Data":"dc5a66eefd014c0580d990cd060601e381de36daa47200d606996b0df5693686"} Oct 10 08:19:42 crc kubenswrapper[4732]: I1010 08:19:42.621363 4732 generic.go:334] "Generic (PLEG): container finished" podID="add98675-a136-495e-ac8d-45049b57e51b" containerID="6b4a0afd3b0af80b544e00e777b7a87827376c177ac04897493c68162479b41f" exitCode=0 Oct 10 08:19:42 crc kubenswrapper[4732]: I1010 08:19:42.621447 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8ec8-account-create-46pz2" event={"ID":"add98675-a136-495e-ac8d-45049b57e51b","Type":"ContainerDied","Data":"6b4a0afd3b0af80b544e00e777b7a87827376c177ac04897493c68162479b41f"} Oct 10 08:19:43 crc kubenswrapper[4732]: I1010 08:19:43.992047 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8ec8-account-create-46pz2" Oct 10 08:19:44 crc kubenswrapper[4732]: I1010 08:19:44.091774 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x56cq\" (UniqueName: \"kubernetes.io/projected/add98675-a136-495e-ac8d-45049b57e51b-kube-api-access-x56cq\") pod \"add98675-a136-495e-ac8d-45049b57e51b\" (UID: \"add98675-a136-495e-ac8d-45049b57e51b\") " Oct 10 08:19:44 crc kubenswrapper[4732]: I1010 08:19:44.097594 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add98675-a136-495e-ac8d-45049b57e51b-kube-api-access-x56cq" (OuterVolumeSpecName: "kube-api-access-x56cq") pod "add98675-a136-495e-ac8d-45049b57e51b" (UID: "add98675-a136-495e-ac8d-45049b57e51b"). InnerVolumeSpecName "kube-api-access-x56cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:19:44 crc kubenswrapper[4732]: I1010 08:19:44.193605 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x56cq\" (UniqueName: \"kubernetes.io/projected/add98675-a136-495e-ac8d-45049b57e51b-kube-api-access-x56cq\") on node \"crc\" DevicePath \"\"" Oct 10 08:19:44 crc kubenswrapper[4732]: I1010 08:19:44.641255 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8ec8-account-create-46pz2" event={"ID":"add98675-a136-495e-ac8d-45049b57e51b","Type":"ContainerDied","Data":"dc5a66eefd014c0580d990cd060601e381de36daa47200d606996b0df5693686"} Oct 10 08:19:44 crc kubenswrapper[4732]: I1010 08:19:44.641317 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc5a66eefd014c0580d990cd060601e381de36daa47200d606996b0df5693686" Oct 10 08:19:44 crc kubenswrapper[4732]: I1010 08:19:44.641340 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8ec8-account-create-46pz2" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.076172 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qgnk9"] Oct 10 08:19:46 crc kubenswrapper[4732]: E1010 08:19:46.076975 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add98675-a136-495e-ac8d-45049b57e51b" containerName="mariadb-account-create" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.076997 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="add98675-a136-495e-ac8d-45049b57e51b" containerName="mariadb-account-create" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.077311 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="add98675-a136-495e-ac8d-45049b57e51b" containerName="mariadb-account-create" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.078273 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.080610 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.080657 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q8hww" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.080860 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.094533 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qgnk9"] Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.232613 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p5pk\" (UniqueName: \"kubernetes.io/projected/b458ddfb-e4b7-4348-bf92-57cfc0a37076-kube-api-access-6p5pk\") pod 
\"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.232677 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-config\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.232742 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-combined-ca-bundle\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.334784 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p5pk\" (UniqueName: \"kubernetes.io/projected/b458ddfb-e4b7-4348-bf92-57cfc0a37076-kube-api-access-6p5pk\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.334842 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-config\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.334888 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-combined-ca-bundle\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " 
pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.341006 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-combined-ca-bundle\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.350065 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-config\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.356305 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p5pk\" (UniqueName: \"kubernetes.io/projected/b458ddfb-e4b7-4348-bf92-57cfc0a37076-kube-api-access-6p5pk\") pod \"neutron-db-sync-qgnk9\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") " pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.405438 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qgnk9" Oct 10 08:19:46 crc kubenswrapper[4732]: I1010 08:19:46.896538 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qgnk9"] Oct 10 08:19:47 crc kubenswrapper[4732]: I1010 08:19:47.683433 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qgnk9" event={"ID":"b458ddfb-e4b7-4348-bf92-57cfc0a37076","Type":"ContainerStarted","Data":"98e77f1ca7410be7322702551a539b55acb07180737af5cf28ffd0a5b9abe43b"} Oct 10 08:19:47 crc kubenswrapper[4732]: I1010 08:19:47.683716 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qgnk9" event={"ID":"b458ddfb-e4b7-4348-bf92-57cfc0a37076","Type":"ContainerStarted","Data":"1e98bc5ddb50785954e78a6f27c5f70352970e53410083ca4286b4baf31231ff"} Oct 10 08:19:47 crc kubenswrapper[4732]: I1010 08:19:47.689613 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qgnk9" podStartSLOduration=1.689581318 podStartE2EDuration="1.689581318s" podCreationTimestamp="2025-10-10 08:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:19:47.686108784 +0000 UTC m=+5314.755700065" watchObservedRunningTime="2025-10-10 08:19:47.689581318 +0000 UTC m=+5314.759172599" Oct 10 08:19:51 crc kubenswrapper[4732]: I1010 08:19:51.714455 4732 generic.go:334] "Generic (PLEG): container finished" podID="b458ddfb-e4b7-4348-bf92-57cfc0a37076" containerID="98e77f1ca7410be7322702551a539b55acb07180737af5cf28ffd0a5b9abe43b" exitCode=0 Oct 10 08:19:51 crc kubenswrapper[4732]: I1010 08:19:51.714587 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qgnk9" event={"ID":"b458ddfb-e4b7-4348-bf92-57cfc0a37076","Type":"ContainerDied","Data":"98e77f1ca7410be7322702551a539b55acb07180737af5cf28ffd0a5b9abe43b"} Oct 10 08:19:53 crc 
kubenswrapper[4732]: I1010 08:19:53.057853 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qgnk9"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.157347 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-config\") pod \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") "
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.157577 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p5pk\" (UniqueName: \"kubernetes.io/projected/b458ddfb-e4b7-4348-bf92-57cfc0a37076-kube-api-access-6p5pk\") pod \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") "
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.157620 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-combined-ca-bundle\") pod \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\" (UID: \"b458ddfb-e4b7-4348-bf92-57cfc0a37076\") "
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.163017 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b458ddfb-e4b7-4348-bf92-57cfc0a37076-kube-api-access-6p5pk" (OuterVolumeSpecName: "kube-api-access-6p5pk") pod "b458ddfb-e4b7-4348-bf92-57cfc0a37076" (UID: "b458ddfb-e4b7-4348-bf92-57cfc0a37076"). InnerVolumeSpecName "kube-api-access-6p5pk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.185190 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-config" (OuterVolumeSpecName: "config") pod "b458ddfb-e4b7-4348-bf92-57cfc0a37076" (UID: "b458ddfb-e4b7-4348-bf92-57cfc0a37076"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.191863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b458ddfb-e4b7-4348-bf92-57cfc0a37076" (UID: "b458ddfb-e4b7-4348-bf92-57cfc0a37076"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.259541 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-config\") on node \"crc\" DevicePath \"\""
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.259593 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p5pk\" (UniqueName: \"kubernetes.io/projected/b458ddfb-e4b7-4348-bf92-57cfc0a37076-kube-api-access-6p5pk\") on node \"crc\" DevicePath \"\""
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.259616 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b458ddfb-e4b7-4348-bf92-57cfc0a37076-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.738182 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qgnk9" event={"ID":"b458ddfb-e4b7-4348-bf92-57cfc0a37076","Type":"ContainerDied","Data":"1e98bc5ddb50785954e78a6f27c5f70352970e53410083ca4286b4baf31231ff"}
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.738228 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e98bc5ddb50785954e78a6f27c5f70352970e53410083ca4286b4baf31231ff"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.738250 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qgnk9"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.868104 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-569996dfb5-2tpxw"]
Oct 10 08:19:53 crc kubenswrapper[4732]: E1010 08:19:53.868498 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b458ddfb-e4b7-4348-bf92-57cfc0a37076" containerName="neutron-db-sync"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.868518 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b458ddfb-e4b7-4348-bf92-57cfc0a37076" containerName="neutron-db-sync"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.868771 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b458ddfb-e4b7-4348-bf92-57cfc0a37076" containerName="neutron-db-sync"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.872027 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.880472 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-569996dfb5-2tpxw"]
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.971772 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-nb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.971823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6zb\" (UniqueName: \"kubernetes.io/projected/4d241f11-84b2-4b97-bcfa-3e0966513fbe-kube-api-access-ss6zb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.971912 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-sb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.972156 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-config\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:53 crc kubenswrapper[4732]: I1010 08:19:53.972289 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-dns-svc\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.073470 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-dns-svc\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.073534 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-nb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.073554 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6zb\" (UniqueName: \"kubernetes.io/projected/4d241f11-84b2-4b97-bcfa-3e0966513fbe-kube-api-access-ss6zb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.073599 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-sb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.073658 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-config\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.074538 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-config\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.075175 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-dns-svc\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.075747 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-nb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.076481 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-sb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.097525 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6zb\" (UniqueName: \"kubernetes.io/projected/4d241f11-84b2-4b97-bcfa-3e0966513fbe-kube-api-access-ss6zb\") pod \"dnsmasq-dns-569996dfb5-2tpxw\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") " pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.128142 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c5564f4cb-42cmq"]
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.129655 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.131722 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.132020 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-q8hww"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.132179 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.132288 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.140560 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c5564f4cb-42cmq"]
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.175289 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-httpd-config\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.175329 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqwgn\" (UniqueName: \"kubernetes.io/projected/bee65779-395e-4f18-aa0c-e2ff5fa138db-kube-api-access-qqwgn\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.175367 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-config\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.175421 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-ovndb-tls-certs\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.175449 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-combined-ca-bundle\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.191832 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.276730 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-ovndb-tls-certs\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.276794 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-combined-ca-bundle\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.276858 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-httpd-config\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.276876 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqwgn\" (UniqueName: \"kubernetes.io/projected/bee65779-395e-4f18-aa0c-e2ff5fa138db-kube-api-access-qqwgn\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.276909 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-config\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.283354 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-httpd-config\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.284101 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-config\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.287495 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-combined-ca-bundle\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.292392 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-ovndb-tls-certs\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.307430 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqwgn\" (UniqueName: \"kubernetes.io/projected/bee65779-395e-4f18-aa0c-e2ff5fa138db-kube-api-access-qqwgn\") pod \"neutron-7c5564f4cb-42cmq\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.454347 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.652605 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-569996dfb5-2tpxw"]
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.748013 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw" event={"ID":"4d241f11-84b2-4b97-bcfa-3e0966513fbe","Type":"ContainerStarted","Data":"f7d6cb39dc46249956bf8e7caa228918abb85be83b48853fc17ff5531c96f512"}
Oct 10 08:19:54 crc kubenswrapper[4732]: I1010 08:19:54.954885 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c5564f4cb-42cmq"]
Oct 10 08:19:55 crc kubenswrapper[4732]: I1010 08:19:55.758449 4732 generic.go:334] "Generic (PLEG): container finished" podID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerID="69d2ee98b2dc0f3b3cd590b27bf4246fa79d155b5044415bdf1bc8363fea1ccc" exitCode=0
Oct 10 08:19:55 crc kubenswrapper[4732]: I1010 08:19:55.758493 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw" event={"ID":"4d241f11-84b2-4b97-bcfa-3e0966513fbe","Type":"ContainerDied","Data":"69d2ee98b2dc0f3b3cd590b27bf4246fa79d155b5044415bdf1bc8363fea1ccc"}
Oct 10 08:19:55 crc kubenswrapper[4732]: I1010 08:19:55.804024 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5564f4cb-42cmq" event={"ID":"bee65779-395e-4f18-aa0c-e2ff5fa138db","Type":"ContainerStarted","Data":"ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8"}
Oct 10 08:19:55 crc kubenswrapper[4732]: I1010 08:19:55.804079 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5564f4cb-42cmq" event={"ID":"bee65779-395e-4f18-aa0c-e2ff5fa138db","Type":"ContainerStarted","Data":"5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969"}
Oct 10 08:19:55 crc kubenswrapper[4732]: I1010 08:19:55.804091 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5564f4cb-42cmq" event={"ID":"bee65779-395e-4f18-aa0c-e2ff5fa138db","Type":"ContainerStarted","Data":"75473cab4e9c6f42b83e7bbdef25d8a9f80916bbd4ea68360ef6280c1378b04e"}
Oct 10 08:19:55 crc kubenswrapper[4732]: I1010 08:19:55.804986 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c5564f4cb-42cmq"
Oct 10 08:19:55 crc kubenswrapper[4732]: I1010 08:19:55.873849 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c5564f4cb-42cmq" podStartSLOduration=1.873826582 podStartE2EDuration="1.873826582s" podCreationTimestamp="2025-10-10 08:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:19:55.850392449 +0000 UTC m=+5322.919983700" watchObservedRunningTime="2025-10-10 08:19:55.873826582 +0000 UTC m=+5322.943417823"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.269154 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-57c9989b5f-7clkk"]
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.271311 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.274063 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.274281 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.297357 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57c9989b5f-7clkk"]
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.317255 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbzpx\" (UniqueName: \"kubernetes.io/projected/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-kube-api-access-mbzpx\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.317358 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-combined-ca-bundle\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.317401 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-internal-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.317429 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-config\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.317454 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-ovndb-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.317483 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-httpd-config\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.317549 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-public-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.418587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-config\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.418628 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-ovndb-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.418655 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-httpd-config\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.418724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-public-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.418768 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbzpx\" (UniqueName: \"kubernetes.io/projected/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-kube-api-access-mbzpx\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.418816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-combined-ca-bundle\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.418842 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-internal-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.424107 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-combined-ca-bundle\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.424134 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-config\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.425125 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-httpd-config\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.425277 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-public-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.425305 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-ovndb-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.429642 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-internal-tls-certs\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.438301 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbzpx\" (UniqueName: \"kubernetes.io/projected/aa03b38d-f0b0-4556-b71d-1abc28d2eb82-kube-api-access-mbzpx\") pod \"neutron-57c9989b5f-7clkk\" (UID: \"aa03b38d-f0b0-4556-b71d-1abc28d2eb82\") " pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.592015 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.818464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw" event={"ID":"4d241f11-84b2-4b97-bcfa-3e0966513fbe","Type":"ContainerStarted","Data":"18ed10fd5fbde4ee2ef4a49ecef064e7eb77f3ddb82f4f5968063cdf460d23b0"}
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.818841 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:19:56 crc kubenswrapper[4732]: I1010 08:19:56.841601 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw" podStartSLOduration=3.841577698 podStartE2EDuration="3.841577698s" podCreationTimestamp="2025-10-10 08:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:19:56.836224403 +0000 UTC m=+5323.905815664" watchObservedRunningTime="2025-10-10 08:19:56.841577698 +0000 UTC m=+5323.911168959"
Oct 10 08:19:57 crc kubenswrapper[4732]: I1010 08:19:57.188183 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-57c9989b5f-7clkk"]
Oct 10 08:19:57 crc kubenswrapper[4732]: I1010 08:19:57.847462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c9989b5f-7clkk" event={"ID":"aa03b38d-f0b0-4556-b71d-1abc28d2eb82","Type":"ContainerStarted","Data":"3c07f9d944414f68815f0a73d8915d68ac0dd0eb06337f833bdef3ee4300f62b"}
Oct 10 08:19:57 crc kubenswrapper[4732]: I1010 08:19:57.847824 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c9989b5f-7clkk" event={"ID":"aa03b38d-f0b0-4556-b71d-1abc28d2eb82","Type":"ContainerStarted","Data":"9124198677cde519865ee77e940930d2952be54207e17dd032438a0e4fcfa349"}
Oct 10 08:19:57 crc kubenswrapper[4732]: I1010 08:19:57.847839 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-57c9989b5f-7clkk" event={"ID":"aa03b38d-f0b0-4556-b71d-1abc28d2eb82","Type":"ContainerStarted","Data":"d8d4d935caa5673424423bb09c3c948774b685f51a1d38882f979e4345b0ecc9"}
Oct 10 08:19:57 crc kubenswrapper[4732]: I1010 08:19:57.848595 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-57c9989b5f-7clkk"
Oct 10 08:19:57 crc kubenswrapper[4732]: I1010 08:19:57.876965 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-57c9989b5f-7clkk" podStartSLOduration=1.876943759 podStartE2EDuration="1.876943759s" podCreationTimestamp="2025-10-10 08:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:19:57.867111963 +0000 UTC m=+5324.936703204" watchObservedRunningTime="2025-10-10 08:19:57.876943759 +0000 UTC m=+5324.946535000"
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.194000 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.274303 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6888c7f469-lf5dv"]
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.274661 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" podUID="113b111b-bca6-49ea-9c64-7a903abecef8" containerName="dnsmasq-dns" containerID="cri-o://01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029" gracePeriod=10
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.761508 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv"
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.899234 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-dns-svc\") pod \"113b111b-bca6-49ea-9c64-7a903abecef8\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") "
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.899381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-sb\") pod \"113b111b-bca6-49ea-9c64-7a903abecef8\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") "
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.900148 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-config\") pod \"113b111b-bca6-49ea-9c64-7a903abecef8\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") "
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.900967 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lfh7\" (UniqueName: \"kubernetes.io/projected/113b111b-bca6-49ea-9c64-7a903abecef8-kube-api-access-2lfh7\") pod \"113b111b-bca6-49ea-9c64-7a903abecef8\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") "
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.901017 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-nb\") pod \"113b111b-bca6-49ea-9c64-7a903abecef8\" (UID: \"113b111b-bca6-49ea-9c64-7a903abecef8\") "
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.904744 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113b111b-bca6-49ea-9c64-7a903abecef8-kube-api-access-2lfh7" (OuterVolumeSpecName: "kube-api-access-2lfh7") pod "113b111b-bca6-49ea-9c64-7a903abecef8" (UID: "113b111b-bca6-49ea-9c64-7a903abecef8"). InnerVolumeSpecName "kube-api-access-2lfh7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.915883 4732 generic.go:334] "Generic (PLEG): container finished" podID="113b111b-bca6-49ea-9c64-7a903abecef8" containerID="01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029" exitCode=0
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.915934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" event={"ID":"113b111b-bca6-49ea-9c64-7a903abecef8","Type":"ContainerDied","Data":"01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029"}
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.915967 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv" event={"ID":"113b111b-bca6-49ea-9c64-7a903abecef8","Type":"ContainerDied","Data":"9cc47f48f0909ffce09e6e90eb19057cadb8d3e5dd7f0bb8e2be4f00d79d9da7"}
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.915987 4732 scope.go:117] "RemoveContainer" containerID="01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029"
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.916154 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6888c7f469-lf5dv"
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.945127 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "113b111b-bca6-49ea-9c64-7a903abecef8" (UID: "113b111b-bca6-49ea-9c64-7a903abecef8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.947935 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-config" (OuterVolumeSpecName: "config") pod "113b111b-bca6-49ea-9c64-7a903abecef8" (UID: "113b111b-bca6-49ea-9c64-7a903abecef8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.953735 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "113b111b-bca6-49ea-9c64-7a903abecef8" (UID: "113b111b-bca6-49ea-9c64-7a903abecef8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:04 crc kubenswrapper[4732]: I1010 08:20:04.956677 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "113b111b-bca6-49ea-9c64-7a903abecef8" (UID: "113b111b-bca6-49ea-9c64-7a903abecef8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.004421 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-config\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.004646 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lfh7\" (UniqueName: \"kubernetes.io/projected/113b111b-bca6-49ea-9c64-7a903abecef8-kube-api-access-2lfh7\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.004793 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.004855 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.004910 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b111b-bca6-49ea-9c64-7a903abecef8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.011168 4732 scope.go:117] "RemoveContainer" containerID="1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b"
Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.031262 4732 scope.go:117] "RemoveContainer" containerID="01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029"
Oct 10 08:20:05 crc kubenswrapper[4732]: E1010 08:20:05.031945 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029\": 
container with ID starting with 01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029 not found: ID does not exist" containerID="01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029" Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.031981 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029"} err="failed to get container status \"01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029\": rpc error: code = NotFound desc = could not find container \"01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029\": container with ID starting with 01869df3cbe9e9db50136b79db503f64e2ebd20f8315a76074b9651d0be57029 not found: ID does not exist" Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.032004 4732 scope.go:117] "RemoveContainer" containerID="1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b" Oct 10 08:20:05 crc kubenswrapper[4732]: E1010 08:20:05.032735 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b\": container with ID starting with 1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b not found: ID does not exist" containerID="1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b" Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.032769 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b"} err="failed to get container status \"1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b\": rpc error: code = NotFound desc = could not find container \"1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b\": container with ID starting with 
1ae3ac4e0c6af5f2a19d54d2f2c0a0863b1dd8c3a8ebe35fbabd0192e4d8fa8b not found: ID does not exist" Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.247892 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6888c7f469-lf5dv"] Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.253444 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6888c7f469-lf5dv"] Oct 10 08:20:05 crc kubenswrapper[4732]: I1010 08:20:05.674017 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113b111b-bca6-49ea-9c64-7a903abecef8" path="/var/lib/kubelet/pods/113b111b-bca6-49ea-9c64-7a903abecef8/volumes" Oct 10 08:20:24 crc kubenswrapper[4732]: I1010 08:20:24.461989 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c5564f4cb-42cmq" Oct 10 08:20:25 crc kubenswrapper[4732]: I1010 08:20:25.851425 4732 scope.go:117] "RemoveContainer" containerID="4e77e523ea519001cb9b7fac2d5e128f39ee1f7fcd373c6018df6abf29d9821c" Oct 10 08:20:25 crc kubenswrapper[4732]: I1010 08:20:25.874421 4732 scope.go:117] "RemoveContainer" containerID="4ef2ee8a6a9bdc571ac397badaea08db591968935e4dd3dbce15a9b1ea20f500" Oct 10 08:20:25 crc kubenswrapper[4732]: I1010 08:20:25.917055 4732 scope.go:117] "RemoveContainer" containerID="bb0bebf7c7e500a642da1b279f14f0e83c07dc42c479c5680e2d4d35c9b92026" Oct 10 08:20:25 crc kubenswrapper[4732]: I1010 08:20:25.965031 4732 scope.go:117] "RemoveContainer" containerID="307e51bedfc6323b4b22d27d0ca5a9b35b5fb59e454d2e92e1b76a992141177b" Oct 10 08:20:26 crc kubenswrapper[4732]: I1010 08:20:26.609241 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-57c9989b5f-7clkk" Oct 10 08:20:26 crc kubenswrapper[4732]: I1010 08:20:26.678920 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c5564f4cb-42cmq"] Oct 10 08:20:26 crc kubenswrapper[4732]: I1010 08:20:26.679293 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c5564f4cb-42cmq" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-api" containerID="cri-o://5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969" gracePeriod=30 Oct 10 08:20:26 crc kubenswrapper[4732]: I1010 08:20:26.679488 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c5564f4cb-42cmq" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-httpd" containerID="cri-o://ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8" gracePeriod=30 Oct 10 08:20:27 crc kubenswrapper[4732]: I1010 08:20:27.133015 4732 generic.go:334] "Generic (PLEG): container finished" podID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerID="ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8" exitCode=0 Oct 10 08:20:27 crc kubenswrapper[4732]: I1010 08:20:27.133115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5564f4cb-42cmq" event={"ID":"bee65779-395e-4f18-aa0c-e2ff5fa138db","Type":"ContainerDied","Data":"ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8"} Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.781595 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c5564f4cb-42cmq" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.880100 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqwgn\" (UniqueName: \"kubernetes.io/projected/bee65779-395e-4f18-aa0c-e2ff5fa138db-kube-api-access-qqwgn\") pod \"bee65779-395e-4f18-aa0c-e2ff5fa138db\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.880176 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-ovndb-tls-certs\") pod \"bee65779-395e-4f18-aa0c-e2ff5fa138db\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.880336 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-combined-ca-bundle\") pod \"bee65779-395e-4f18-aa0c-e2ff5fa138db\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.880434 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-config\") pod \"bee65779-395e-4f18-aa0c-e2ff5fa138db\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.880500 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-httpd-config\") pod \"bee65779-395e-4f18-aa0c-e2ff5fa138db\" (UID: \"bee65779-395e-4f18-aa0c-e2ff5fa138db\") " Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.896111 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bee65779-395e-4f18-aa0c-e2ff5fa138db" (UID: "bee65779-395e-4f18-aa0c-e2ff5fa138db"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.896151 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee65779-395e-4f18-aa0c-e2ff5fa138db-kube-api-access-qqwgn" (OuterVolumeSpecName: "kube-api-access-qqwgn") pod "bee65779-395e-4f18-aa0c-e2ff5fa138db" (UID: "bee65779-395e-4f18-aa0c-e2ff5fa138db"). InnerVolumeSpecName "kube-api-access-qqwgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.928161 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bee65779-395e-4f18-aa0c-e2ff5fa138db" (UID: "bee65779-395e-4f18-aa0c-e2ff5fa138db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.956153 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-config" (OuterVolumeSpecName: "config") pod "bee65779-395e-4f18-aa0c-e2ff5fa138db" (UID: "bee65779-395e-4f18-aa0c-e2ff5fa138db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.958494 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bee65779-395e-4f18-aa0c-e2ff5fa138db" (UID: "bee65779-395e-4f18-aa0c-e2ff5fa138db"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.983044 4732 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.983081 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.983096 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.983110 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bee65779-395e-4f18-aa0c-e2ff5fa138db-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:29 crc kubenswrapper[4732]: I1010 08:20:29.983124 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqwgn\" (UniqueName: \"kubernetes.io/projected/bee65779-395e-4f18-aa0c-e2ff5fa138db-kube-api-access-qqwgn\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.176409 4732 generic.go:334] "Generic (PLEG): container finished" podID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerID="5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969" exitCode=0 Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.176537 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c5564f4cb-42cmq" Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.176561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5564f4cb-42cmq" event={"ID":"bee65779-395e-4f18-aa0c-e2ff5fa138db","Type":"ContainerDied","Data":"5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969"} Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.178975 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c5564f4cb-42cmq" event={"ID":"bee65779-395e-4f18-aa0c-e2ff5fa138db","Type":"ContainerDied","Data":"75473cab4e9c6f42b83e7bbdef25d8a9f80916bbd4ea68360ef6280c1378b04e"} Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.179043 4732 scope.go:117] "RemoveContainer" containerID="ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8" Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.221490 4732 scope.go:117] "RemoveContainer" containerID="5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969" Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.224879 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c5564f4cb-42cmq"] Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.232416 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c5564f4cb-42cmq"] Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.243648 4732 scope.go:117] "RemoveContainer" containerID="ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8" Oct 10 08:20:30 crc kubenswrapper[4732]: E1010 08:20:30.244197 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8\": container with ID starting with ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8 not found: ID does not exist" 
containerID="ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8" Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.244237 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8"} err="failed to get container status \"ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8\": rpc error: code = NotFound desc = could not find container \"ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8\": container with ID starting with ffac0d5d0d912c2f08b60eb5a502571a74a1ae647689a0296f895c70c8ae1af8 not found: ID does not exist" Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.244267 4732 scope.go:117] "RemoveContainer" containerID="5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969" Oct 10 08:20:30 crc kubenswrapper[4732]: E1010 08:20:30.244652 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969\": container with ID starting with 5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969 not found: ID does not exist" containerID="5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969" Oct 10 08:20:30 crc kubenswrapper[4732]: I1010 08:20:30.244718 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969"} err="failed to get container status \"5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969\": rpc error: code = NotFound desc = could not find container \"5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969\": container with ID starting with 5b78e1e604a132b3143efeb29258a413ecfa20274a2b651c919d29e28cacb969 not found: ID does not exist" Oct 10 08:20:31 crc kubenswrapper[4732]: I1010 08:20:31.671383 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" path="/var/lib/kubelet/pods/bee65779-395e-4f18-aa0c-e2ff5fa138db/volumes" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.537832 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2cj82"] Oct 10 08:20:35 crc kubenswrapper[4732]: E1010 08:20:35.538566 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113b111b-bca6-49ea-9c64-7a903abecef8" containerName="init" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.538584 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="113b111b-bca6-49ea-9c64-7a903abecef8" containerName="init" Oct 10 08:20:35 crc kubenswrapper[4732]: E1010 08:20:35.538630 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113b111b-bca6-49ea-9c64-7a903abecef8" containerName="dnsmasq-dns" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.538638 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="113b111b-bca6-49ea-9c64-7a903abecef8" containerName="dnsmasq-dns" Oct 10 08:20:35 crc kubenswrapper[4732]: E1010 08:20:35.538662 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-api" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.538670 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-api" Oct 10 08:20:35 crc kubenswrapper[4732]: E1010 08:20:35.538678 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-httpd" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.538686 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-httpd" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.538923 4732 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="113b111b-bca6-49ea-9c64-7a903abecef8" containerName="dnsmasq-dns" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.538936 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-api" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.538960 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee65779-395e-4f18-aa0c-e2ff5fa138db" containerName="neutron-httpd" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.539727 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.548240 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.548319 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-srk94" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.548319 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.548647 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.553249 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.623621 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-swk7w"] Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.625389 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.646228 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2cj82"] Oct 10 08:20:35 crc kubenswrapper[4732]: E1010 08:20:35.650263 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-s7wqm ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-2cj82" podUID="2a70cd07-d890-4611-bdef-d87e89c4ee76" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.688768 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-swk7w"] Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.701992 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-combined-ca-bundle\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702042 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d08ea64-6030-4467-964d-f85c284a1a1b-etc-swift\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702077 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-scripts\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 
08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-combined-ca-bundle\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702165 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-scripts\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-swiftconf\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-swiftconf\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702238 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9w4q\" (UniqueName: \"kubernetes.io/projected/1d08ea64-6030-4467-964d-f85c284a1a1b-kube-api-access-j9w4q\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 
08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702266 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-ring-data-devices\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wqm\" (UniqueName: \"kubernetes.io/projected/2a70cd07-d890-4611-bdef-d87e89c4ee76-kube-api-access-s7wqm\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702346 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a70cd07-d890-4611-bdef-d87e89c4ee76-etc-swift\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702370 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-dispersionconf\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702407 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-ring-data-devices\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " 
pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.702428 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-dispersionconf\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.707415 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2cj82"] Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.748778 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c48b4dd5-8xlrw"] Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.750586 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.765638 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c48b4dd5-8xlrw"] Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.803654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9w4q\" (UniqueName: \"kubernetes.io/projected/1d08ea64-6030-4467-964d-f85c284a1a1b-kube-api-access-j9w4q\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804002 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-ring-data-devices\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804051 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7wqm\" (UniqueName: \"kubernetes.io/projected/2a70cd07-d890-4611-bdef-d87e89c4ee76-kube-api-access-s7wqm\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804156 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a70cd07-d890-4611-bdef-d87e89c4ee76-etc-swift\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804185 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-dispersionconf\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-ring-data-devices\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804268 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-dispersionconf\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804356 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-combined-ca-bundle\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804390 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d08ea64-6030-4467-964d-f85c284a1a1b-etc-swift\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804428 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-scripts\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804526 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-combined-ca-bundle\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804589 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-scripts\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-swiftconf\") 
pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.804635 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-swiftconf\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.805063 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d08ea64-6030-4467-964d-f85c284a1a1b-etc-swift\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.805122 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-ring-data-devices\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.805572 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-scripts\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.806682 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-scripts\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc 
kubenswrapper[4732]: I1010 08:20:35.806806 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a70cd07-d890-4611-bdef-d87e89c4ee76-etc-swift\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.810460 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-swiftconf\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.813378 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-combined-ca-bundle\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.818737 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-dispersionconf\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.824314 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wqm\" (UniqueName: \"kubernetes.io/projected/2a70cd07-d890-4611-bdef-d87e89c4ee76-kube-api-access-s7wqm\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.827393 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-ring-data-devices\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.827943 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9w4q\" (UniqueName: \"kubernetes.io/projected/1d08ea64-6030-4467-964d-f85c284a1a1b-kube-api-access-j9w4q\") pod \"swift-ring-rebalance-swk7w\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.828825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-combined-ca-bundle\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.832866 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-swiftconf\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.844786 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-dispersionconf\") pod \"swift-ring-rebalance-2cj82\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.906187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.906275 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4p7t\" (UniqueName: \"kubernetes.io/projected/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-kube-api-access-x4p7t\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.906304 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-dns-svc\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.906323 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-config\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.906387 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:35 crc kubenswrapper[4732]: I1010 08:20:35.961419 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.008437 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4p7t\" (UniqueName: \"kubernetes.io/projected/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-kube-api-access-x4p7t\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.008482 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-dns-svc\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.008505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-config\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.008551 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.008620 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 
08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.009773 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-dns-svc\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.009877 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-config\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.009975 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.010338 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.028441 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4p7t\" (UniqueName: \"kubernetes.io/projected/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-kube-api-access-x4p7t\") pod \"dnsmasq-dns-6c48b4dd5-8xlrw\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.087711 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.250673 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.258058 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-swk7w"] Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.264313 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.284352 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.317651 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-ring-data-devices\") pod \"2a70cd07-d890-4611-bdef-d87e89c4ee76\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.317771 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wqm\" (UniqueName: \"kubernetes.io/projected/2a70cd07-d890-4611-bdef-d87e89c4ee76-kube-api-access-s7wqm\") pod \"2a70cd07-d890-4611-bdef-d87e89c4ee76\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.317903 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-combined-ca-bundle\") pod \"2a70cd07-d890-4611-bdef-d87e89c4ee76\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.317992 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a70cd07-d890-4611-bdef-d87e89c4ee76-etc-swift\") pod \"2a70cd07-d890-4611-bdef-d87e89c4ee76\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.318077 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-scripts\") pod \"2a70cd07-d890-4611-bdef-d87e89c4ee76\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.318103 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-dispersionconf\") pod \"2a70cd07-d890-4611-bdef-d87e89c4ee76\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.318135 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-swiftconf\") pod \"2a70cd07-d890-4611-bdef-d87e89c4ee76\" (UID: \"2a70cd07-d890-4611-bdef-d87e89c4ee76\") " Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.318392 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2a70cd07-d890-4611-bdef-d87e89c4ee76" (UID: "2a70cd07-d890-4611-bdef-d87e89c4ee76"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.318648 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.318673 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a70cd07-d890-4611-bdef-d87e89c4ee76-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2a70cd07-d890-4611-bdef-d87e89c4ee76" (UID: "2a70cd07-d890-4611-bdef-d87e89c4ee76"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.319267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-scripts" (OuterVolumeSpecName: "scripts") pod "2a70cd07-d890-4611-bdef-d87e89c4ee76" (UID: "2a70cd07-d890-4611-bdef-d87e89c4ee76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.322873 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2a70cd07-d890-4611-bdef-d87e89c4ee76" (UID: "2a70cd07-d890-4611-bdef-d87e89c4ee76"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.325529 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2a70cd07-d890-4611-bdef-d87e89c4ee76" (UID: "2a70cd07-d890-4611-bdef-d87e89c4ee76"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.327906 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a70cd07-d890-4611-bdef-d87e89c4ee76" (UID: "2a70cd07-d890-4611-bdef-d87e89c4ee76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.330222 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a70cd07-d890-4611-bdef-d87e89c4ee76-kube-api-access-s7wqm" (OuterVolumeSpecName: "kube-api-access-s7wqm") pod "2a70cd07-d890-4611-bdef-d87e89c4ee76" (UID: "2a70cd07-d890-4611-bdef-d87e89c4ee76"). InnerVolumeSpecName "kube-api-access-s7wqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.422330 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2a70cd07-d890-4611-bdef-d87e89c4ee76-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.422370 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2a70cd07-d890-4611-bdef-d87e89c4ee76-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.422382 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.422397 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 
10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.422410 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wqm\" (UniqueName: \"kubernetes.io/projected/2a70cd07-d890-4611-bdef-d87e89c4ee76-kube-api-access-s7wqm\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.422421 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a70cd07-d890-4611-bdef-d87e89c4ee76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:20:36 crc kubenswrapper[4732]: I1010 08:20:36.557151 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c48b4dd5-8xlrw"] Oct 10 08:20:36 crc kubenswrapper[4732]: W1010 08:20:36.561280 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd54d4b7e_2314_4610_bfe8_8faa5f0406d8.slice/crio-249304babd57c03cf8da4f09e8bcfbe2245369f62aed3882d6be3ab7ab8be21a WatchSource:0}: Error finding container 249304babd57c03cf8da4f09e8bcfbe2245369f62aed3882d6be3ab7ab8be21a: Status 404 returned error can't find the container with id 249304babd57c03cf8da4f09e8bcfbe2245369f62aed3882d6be3ab7ab8be21a Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.276136 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-swk7w" event={"ID":"1d08ea64-6030-4467-964d-f85c284a1a1b","Type":"ContainerStarted","Data":"bd57f1685a85dee58c7dcad62ac0b2d7e758a6bfb6336083e560c141fd93cfda"} Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.278365 4732 generic.go:334] "Generic (PLEG): container finished" podID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerID="a910e9de8a5fd88e36b61485dd740b0c45380d2f7ee7f977b7a0778bced7eacc" exitCode=0 Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.278418 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2cj82" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.279469 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" event={"ID":"d54d4b7e-2314-4610-bfe8-8faa5f0406d8","Type":"ContainerDied","Data":"a910e9de8a5fd88e36b61485dd740b0c45380d2f7ee7f977b7a0778bced7eacc"} Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.279582 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" event={"ID":"d54d4b7e-2314-4610-bfe8-8faa5f0406d8","Type":"ContainerStarted","Data":"249304babd57c03cf8da4f09e8bcfbe2245369f62aed3882d6be3ab7ab8be21a"} Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.352554 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-2cj82"] Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.357662 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-2cj82"] Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.670189 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a70cd07-d890-4611-bdef-d87e89c4ee76" path="/var/lib/kubelet/pods/2a70cd07-d890-4611-bdef-d87e89c4ee76/volumes" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.883795 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7776658547-4fgmc"] Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.885210 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.897936 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.908642 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7776658547-4fgmc"] Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.960346 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-combined-ca-bundle\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.960671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-etc-swift\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.960737 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-config-data\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.960770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-log-httpd\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " 
pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.960933 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sq57\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-kube-api-access-5sq57\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:37 crc kubenswrapper[4732]: I1010 08:20:37.961066 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-run-httpd\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.063330 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sq57\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-kube-api-access-5sq57\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.063441 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-run-httpd\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.063522 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-combined-ca-bundle\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " 
pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.063609 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-etc-swift\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.063688 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-config-data\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.063764 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-log-httpd\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.064370 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-log-httpd\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.064381 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-run-httpd\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.072878 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-etc-swift\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.082791 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-config-data\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.086780 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sq57\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-kube-api-access-5sq57\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.087778 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-combined-ca-bundle\") pod \"swift-proxy-7776658547-4fgmc\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") " pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:38 crc kubenswrapper[4732]: I1010 08:20:38.257478 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.038054 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5764484767-24vlx"] Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.039742 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.078131 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.078168 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.092822 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5764484767-24vlx"] Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.195901 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-internal-tls-certs\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.196229 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a653e702-8013-4a30-b236-a5496d1b29e8-run-httpd\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.196257 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a653e702-8013-4a30-b236-a5496d1b29e8-etc-swift\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.196283 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-config-data\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.196318 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-public-tls-certs\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.196349 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-combined-ca-bundle\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.196371 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn6ph\" (UniqueName: \"kubernetes.io/projected/a653e702-8013-4a30-b236-a5496d1b29e8-kube-api-access-dn6ph\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.196481 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a653e702-8013-4a30-b236-a5496d1b29e8-log-httpd\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.217785 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/swift-proxy-7776658547-4fgmc"] Oct 10 08:20:40 crc kubenswrapper[4732]: W1010 08:20:40.230663 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4556001e_ee45_4c64_84a0_317b90d1583e.slice/crio-d3e5edf993ea4575b11b90639a3a066193321235595d746c37118fd359286b8f WatchSource:0}: Error finding container d3e5edf993ea4575b11b90639a3a066193321235595d746c37118fd359286b8f: Status 404 returned error can't find the container with id d3e5edf993ea4575b11b90639a3a066193321235595d746c37118fd359286b8f Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303101 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-public-tls-certs\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303161 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-combined-ca-bundle\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303183 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn6ph\" (UniqueName: \"kubernetes.io/projected/a653e702-8013-4a30-b236-a5496d1b29e8-kube-api-access-dn6ph\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303237 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a653e702-8013-4a30-b236-a5496d1b29e8-log-httpd\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-internal-tls-certs\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a653e702-8013-4a30-b236-a5496d1b29e8-run-httpd\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a653e702-8013-4a30-b236-a5496d1b29e8-etc-swift\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.303663 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-config-data\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.304658 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a653e702-8013-4a30-b236-a5496d1b29e8-log-httpd\") pod 
\"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.304764 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a653e702-8013-4a30-b236-a5496d1b29e8-run-httpd\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.305832 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776658547-4fgmc" event={"ID":"4556001e-ee45-4c64-84a0-317b90d1583e","Type":"ContainerStarted","Data":"d3e5edf993ea4575b11b90639a3a066193321235595d746c37118fd359286b8f"} Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.307376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-swk7w" event={"ID":"1d08ea64-6030-4467-964d-f85c284a1a1b","Type":"ContainerStarted","Data":"98c30105d88b04f2fb47dccddcbaf124f8d0aa4966b40d4fbe40e913529540bd"} Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.308009 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-combined-ca-bundle\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.309000 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-config-data\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.309377 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a653e702-8013-4a30-b236-a5496d1b29e8-etc-swift\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.310582 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-internal-tls-certs\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.311023 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" event={"ID":"d54d4b7e-2314-4610-bfe8-8faa5f0406d8","Type":"ContainerStarted","Data":"b12c3691d0cdba7073f7a1a1f7d9d957d30652b02a5c306783cef110128e1889"} Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.311175 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.312297 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a653e702-8013-4a30-b236-a5496d1b29e8-public-tls-certs\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.322127 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn6ph\" (UniqueName: \"kubernetes.io/projected/a653e702-8013-4a30-b236-a5496d1b29e8-kube-api-access-dn6ph\") pod \"swift-proxy-5764484767-24vlx\" (UID: \"a653e702-8013-4a30-b236-a5496d1b29e8\") " pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.327570 4732 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-swk7w" podStartSLOduration=1.936026614 podStartE2EDuration="5.327555757s" podCreationTimestamp="2025-10-10 08:20:35 +0000 UTC" firstStartedPulling="2025-10-10 08:20:36.284122108 +0000 UTC m=+5363.353713349" lastFinishedPulling="2025-10-10 08:20:39.675651251 +0000 UTC m=+5366.745242492" observedRunningTime="2025-10-10 08:20:40.321135124 +0000 UTC m=+5367.390726375" watchObservedRunningTime="2025-10-10 08:20:40.327555757 +0000 UTC m=+5367.397146998" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.347378 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" podStartSLOduration=5.347357952 podStartE2EDuration="5.347357952s" podCreationTimestamp="2025-10-10 08:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:20:40.341803512 +0000 UTC m=+5367.411394753" watchObservedRunningTime="2025-10-10 08:20:40.347357952 +0000 UTC m=+5367.416949193" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.393707 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:40 crc kubenswrapper[4732]: I1010 08:20:40.912521 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5764484767-24vlx"] Oct 10 08:20:40 crc kubenswrapper[4732]: W1010 08:20:40.913683 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda653e702_8013_4a30_b236_a5496d1b29e8.slice/crio-17d378ab009e47b723942cb0aaeaf68949eac68ea50f4c858a748fc33bfad6f7 WatchSource:0}: Error finding container 17d378ab009e47b723942cb0aaeaf68949eac68ea50f4c858a748fc33bfad6f7: Status 404 returned error can't find the container with id 17d378ab009e47b723942cb0aaeaf68949eac68ea50f4c858a748fc33bfad6f7 Oct 10 08:20:41 crc kubenswrapper[4732]: I1010 08:20:41.321234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5764484767-24vlx" event={"ID":"a653e702-8013-4a30-b236-a5496d1b29e8","Type":"ContainerStarted","Data":"fd1fd2939b72ea3fd7a02c40ad0ce105888e379d715aa11706d7118283f3302e"} Oct 10 08:20:41 crc kubenswrapper[4732]: I1010 08:20:41.323092 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5764484767-24vlx" event={"ID":"a653e702-8013-4a30-b236-a5496d1b29e8","Type":"ContainerStarted","Data":"17d378ab009e47b723942cb0aaeaf68949eac68ea50f4c858a748fc33bfad6f7"} Oct 10 08:20:41 crc kubenswrapper[4732]: I1010 08:20:41.323711 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776658547-4fgmc" event={"ID":"4556001e-ee45-4c64-84a0-317b90d1583e","Type":"ContainerStarted","Data":"6fa80bee506149becc656bb66c5b1e7d86bc0c28ba9db04dc011689a38ae3cb5"} Oct 10 08:20:41 crc kubenswrapper[4732]: I1010 08:20:41.323830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776658547-4fgmc" 
event={"ID":"4556001e-ee45-4c64-84a0-317b90d1583e","Type":"ContainerStarted","Data":"a5d4e40b438a0aa8949423f9e8cb9c14aab33f4d719867b23c63dfa9dc537974"} Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.282300 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7776658547-4fgmc" podStartSLOduration=5.28227761 podStartE2EDuration="5.28227761s" podCreationTimestamp="2025-10-10 08:20:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:20:41.342614841 +0000 UTC m=+5368.412206082" watchObservedRunningTime="2025-10-10 08:20:42.28227761 +0000 UTC m=+5369.351868851" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.306840 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rx4q4"] Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.309061 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.318155 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rx4q4"] Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.416882 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5764484767-24vlx" event={"ID":"a653e702-8013-4a30-b236-a5496d1b29e8","Type":"ContainerStarted","Data":"b11f6c8d3d46132c8b4a5c8cf434ac767c0fcb9e04fcced129e39bb532465d27"} Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.416926 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.416967 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7776658547-4fgmc" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.416987 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.417004 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5764484767-24vlx" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.450386 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-utilities\") pod \"redhat-marketplace-rx4q4\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.450529 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-catalog-content\") pod \"redhat-marketplace-rx4q4\" (UID: 
\"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.450567 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5q5\" (UniqueName: \"kubernetes.io/projected/330e92eb-e631-46de-8b50-c9183d81a284-kube-api-access-6z5q5\") pod \"redhat-marketplace-rx4q4\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.552442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-catalog-content\") pod \"redhat-marketplace-rx4q4\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.553119 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-catalog-content\") pod \"redhat-marketplace-rx4q4\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.553469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5q5\" (UniqueName: \"kubernetes.io/projected/330e92eb-e631-46de-8b50-c9183d81a284-kube-api-access-6z5q5\") pod \"redhat-marketplace-rx4q4\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.553606 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-utilities\") pod \"redhat-marketplace-rx4q4\" (UID: 
\"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.553998 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-utilities\") pod \"redhat-marketplace-rx4q4\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.580943 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5q5\" (UniqueName: \"kubernetes.io/projected/330e92eb-e631-46de-8b50-c9183d81a284-kube-api-access-6z5q5\") pod \"redhat-marketplace-rx4q4\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") " pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:42 crc kubenswrapper[4732]: I1010 08:20:42.680470 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rx4q4" Oct 10 08:20:43 crc kubenswrapper[4732]: I1010 08:20:43.119663 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5764484767-24vlx" podStartSLOduration=3.119645418 podStartE2EDuration="3.119645418s" podCreationTimestamp="2025-10-10 08:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:20:42.441929157 +0000 UTC m=+5369.511520408" watchObservedRunningTime="2025-10-10 08:20:43.119645418 +0000 UTC m=+5370.189236659" Oct 10 08:20:43 crc kubenswrapper[4732]: I1010 08:20:43.125616 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rx4q4"] Oct 10 08:20:43 crc kubenswrapper[4732]: W1010 08:20:43.135534 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330e92eb_e631_46de_8b50_c9183d81a284.slice/crio-518930a1342c92b95e28ac8ec569554c8bad855edb8f0ba1a77d759cf0ea96ad WatchSource:0}: Error finding container 518930a1342c92b95e28ac8ec569554c8bad855edb8f0ba1a77d759cf0ea96ad: Status 404 returned error can't find the container with id 518930a1342c92b95e28ac8ec569554c8bad855edb8f0ba1a77d759cf0ea96ad Oct 10 08:20:43 crc kubenswrapper[4732]: I1010 08:20:43.427563 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rx4q4" event={"ID":"330e92eb-e631-46de-8b50-c9183d81a284","Type":"ContainerStarted","Data":"3407d65780a652c9d4adf45e8752349c569186b2fa02ae356dd226bc901a3bbe"} Oct 10 08:20:43 crc kubenswrapper[4732]: I1010 08:20:43.427614 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rx4q4" event={"ID":"330e92eb-e631-46de-8b50-c9183d81a284","Type":"ContainerStarted","Data":"518930a1342c92b95e28ac8ec569554c8bad855edb8f0ba1a77d759cf0ea96ad"} Oct 10 08:20:44 crc kubenswrapper[4732]: I1010 08:20:44.436047 4732 generic.go:334] "Generic (PLEG): container finished" podID="330e92eb-e631-46de-8b50-c9183d81a284" containerID="3407d65780a652c9d4adf45e8752349c569186b2fa02ae356dd226bc901a3bbe" exitCode=0 Oct 10 08:20:44 crc kubenswrapper[4732]: I1010 08:20:44.436315 4732 generic.go:334] "Generic (PLEG): container finished" podID="330e92eb-e631-46de-8b50-c9183d81a284" containerID="6c2d7ebbbb6224f73cc225552fb48fa7224a34348305f0e2c8cce9595095f4a2" exitCode=0 Oct 10 08:20:44 crc kubenswrapper[4732]: I1010 08:20:44.436086 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rx4q4" event={"ID":"330e92eb-e631-46de-8b50-c9183d81a284","Type":"ContainerDied","Data":"3407d65780a652c9d4adf45e8752349c569186b2fa02ae356dd226bc901a3bbe"} Oct 10 08:20:44 crc kubenswrapper[4732]: I1010 08:20:44.436369 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-rx4q4" event={"ID":"330e92eb-e631-46de-8b50-c9183d81a284","Type":"ContainerDied","Data":"6c2d7ebbbb6224f73cc225552fb48fa7224a34348305f0e2c8cce9595095f4a2"} Oct 10 08:20:44 crc kubenswrapper[4732]: I1010 08:20:44.438458 4732 generic.go:334] "Generic (PLEG): container finished" podID="1d08ea64-6030-4467-964d-f85c284a1a1b" containerID="98c30105d88b04f2fb47dccddcbaf124f8d0aa4966b40d4fbe40e913529540bd" exitCode=0 Oct 10 08:20:44 crc kubenswrapper[4732]: I1010 08:20:44.438618 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-swk7w" event={"ID":"1d08ea64-6030-4467-964d-f85c284a1a1b","Type":"ContainerDied","Data":"98c30105d88b04f2fb47dccddcbaf124f8d0aa4966b40d4fbe40e913529540bd"} Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.453632 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rx4q4" event={"ID":"330e92eb-e631-46de-8b50-c9183d81a284","Type":"ContainerStarted","Data":"6c01a50ac43d638037de195c4d8954f53c7bf57fd9c8f8cf61211366b022e3b7"} Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.475573 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rx4q4" podStartSLOduration=1.8538356440000001 podStartE2EDuration="3.475547993s" podCreationTimestamp="2025-10-10 08:20:42 +0000 UTC" firstStartedPulling="2025-10-10 08:20:43.429288021 +0000 UTC m=+5370.498879262" lastFinishedPulling="2025-10-10 08:20:45.05100033 +0000 UTC m=+5372.120591611" observedRunningTime="2025-10-10 08:20:45.471967617 +0000 UTC m=+5372.541558878" watchObservedRunningTime="2025-10-10 08:20:45.475547993 +0000 UTC m=+5372.545139244" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.816933 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-swk7w" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.919203 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9w4q\" (UniqueName: \"kubernetes.io/projected/1d08ea64-6030-4467-964d-f85c284a1a1b-kube-api-access-j9w4q\") pod \"1d08ea64-6030-4467-964d-f85c284a1a1b\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.919312 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-scripts\") pod \"1d08ea64-6030-4467-964d-f85c284a1a1b\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.919346 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-swiftconf\") pod \"1d08ea64-6030-4467-964d-f85c284a1a1b\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.919458 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-ring-data-devices\") pod \"1d08ea64-6030-4467-964d-f85c284a1a1b\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.919532 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-combined-ca-bundle\") pod \"1d08ea64-6030-4467-964d-f85c284a1a1b\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.919625 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-dispersionconf\") pod \"1d08ea64-6030-4467-964d-f85c284a1a1b\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.919650 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d08ea64-6030-4467-964d-f85c284a1a1b-etc-swift\") pod \"1d08ea64-6030-4467-964d-f85c284a1a1b\" (UID: \"1d08ea64-6030-4467-964d-f85c284a1a1b\") " Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.921524 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1d08ea64-6030-4467-964d-f85c284a1a1b" (UID: "1d08ea64-6030-4467-964d-f85c284a1a1b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.921923 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d08ea64-6030-4467-964d-f85c284a1a1b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1d08ea64-6030-4467-964d-f85c284a1a1b" (UID: "1d08ea64-6030-4467-964d-f85c284a1a1b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.926250 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d08ea64-6030-4467-964d-f85c284a1a1b-kube-api-access-j9w4q" (OuterVolumeSpecName: "kube-api-access-j9w4q") pod "1d08ea64-6030-4467-964d-f85c284a1a1b" (UID: "1d08ea64-6030-4467-964d-f85c284a1a1b"). InnerVolumeSpecName "kube-api-access-j9w4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.928770 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1d08ea64-6030-4467-964d-f85c284a1a1b" (UID: "1d08ea64-6030-4467-964d-f85c284a1a1b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.955027 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-scripts" (OuterVolumeSpecName: "scripts") pod "1d08ea64-6030-4467-964d-f85c284a1a1b" (UID: "1d08ea64-6030-4467-964d-f85c284a1a1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.972729 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d08ea64-6030-4467-964d-f85c284a1a1b" (UID: "1d08ea64-6030-4467-964d-f85c284a1a1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:20:45 crc kubenswrapper[4732]: I1010 08:20:45.983046 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1d08ea64-6030-4467-964d-f85c284a1a1b" (UID: "1d08ea64-6030-4467-964d-f85c284a1a1b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.022910 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-ring-data-devices\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.023068 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.023146 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-dispersionconf\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.023209 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1d08ea64-6030-4467-964d-f85c284a1a1b-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.023264 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9w4q\" (UniqueName: \"kubernetes.io/projected/1d08ea64-6030-4467-964d-f85c284a1a1b-kube-api-access-j9w4q\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.023342 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d08ea64-6030-4467-964d-f85c284a1a1b-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.023398 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1d08ea64-6030-4467-964d-f85c284a1a1b-swiftconf\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.090239 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw"
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.153815 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-569996dfb5-2tpxw"]
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.154089 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw" podUID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerName="dnsmasq-dns" containerID="cri-o://18ed10fd5fbde4ee2ef4a49ecef064e7eb77f3ddb82f4f5968063cdf460d23b0" gracePeriod=10
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.465514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-swk7w" event={"ID":"1d08ea64-6030-4467-964d-f85c284a1a1b","Type":"ContainerDied","Data":"bd57f1685a85dee58c7dcad62ac0b2d7e758a6bfb6336083e560c141fd93cfda"}
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.465792 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd57f1685a85dee58c7dcad62ac0b2d7e758a6bfb6336083e560c141fd93cfda"
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.465600 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-swk7w"
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.467295 4732 generic.go:334] "Generic (PLEG): container finished" podID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerID="18ed10fd5fbde4ee2ef4a49ecef064e7eb77f3ddb82f4f5968063cdf460d23b0" exitCode=0
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.467359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw" event={"ID":"4d241f11-84b2-4b97-bcfa-3e0966513fbe","Type":"ContainerDied","Data":"18ed10fd5fbde4ee2ef4a49ecef064e7eb77f3ddb82f4f5968063cdf460d23b0"}
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.563358 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.633949 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-config\") pod \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") "
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.682720 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-config" (OuterVolumeSpecName: "config") pod "4d241f11-84b2-4b97-bcfa-3e0966513fbe" (UID: "4d241f11-84b2-4b97-bcfa-3e0966513fbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.735325 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-dns-svc\") pod \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") "
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.735417 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss6zb\" (UniqueName: \"kubernetes.io/projected/4d241f11-84b2-4b97-bcfa-3e0966513fbe-kube-api-access-ss6zb\") pod \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") "
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.735508 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-sb\") pod \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") "
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.735636 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-nb\") pod \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\" (UID: \"4d241f11-84b2-4b97-bcfa-3e0966513fbe\") "
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.736180 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-config\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.740300 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d241f11-84b2-4b97-bcfa-3e0966513fbe-kube-api-access-ss6zb" (OuterVolumeSpecName: "kube-api-access-ss6zb") pod "4d241f11-84b2-4b97-bcfa-3e0966513fbe" (UID: "4d241f11-84b2-4b97-bcfa-3e0966513fbe"). InnerVolumeSpecName "kube-api-access-ss6zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.785194 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d241f11-84b2-4b97-bcfa-3e0966513fbe" (UID: "4d241f11-84b2-4b97-bcfa-3e0966513fbe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.788966 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d241f11-84b2-4b97-bcfa-3e0966513fbe" (UID: "4d241f11-84b2-4b97-bcfa-3e0966513fbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.789555 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d241f11-84b2-4b97-bcfa-3e0966513fbe" (UID: "4d241f11-84b2-4b97-bcfa-3e0966513fbe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.837636 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.837673 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.837683 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss6zb\" (UniqueName: \"kubernetes.io/projected/4d241f11-84b2-4b97-bcfa-3e0966513fbe-kube-api-access-ss6zb\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:46 crc kubenswrapper[4732]: I1010 08:20:46.837706 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d241f11-84b2-4b97-bcfa-3e0966513fbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:47 crc kubenswrapper[4732]: I1010 08:20:47.478197 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw" event={"ID":"4d241f11-84b2-4b97-bcfa-3e0966513fbe","Type":"ContainerDied","Data":"f7d6cb39dc46249956bf8e7caa228918abb85be83b48853fc17ff5531c96f512"}
Oct 10 08:20:47 crc kubenswrapper[4732]: I1010 08:20:47.478256 4732 scope.go:117] "RemoveContainer" containerID="18ed10fd5fbde4ee2ef4a49ecef064e7eb77f3ddb82f4f5968063cdf460d23b0"
Oct 10 08:20:47 crc kubenswrapper[4732]: I1010 08:20:47.478304 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-569996dfb5-2tpxw"
Oct 10 08:20:47 crc kubenswrapper[4732]: I1010 08:20:47.517802 4732 scope.go:117] "RemoveContainer" containerID="69d2ee98b2dc0f3b3cd590b27bf4246fa79d155b5044415bdf1bc8363fea1ccc"
Oct 10 08:20:47 crc kubenswrapper[4732]: I1010 08:20:47.527808 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-569996dfb5-2tpxw"]
Oct 10 08:20:47 crc kubenswrapper[4732]: I1010 08:20:47.534474 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-569996dfb5-2tpxw"]
Oct 10 08:20:47 crc kubenswrapper[4732]: I1010 08:20:47.672573 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" path="/var/lib/kubelet/pods/4d241f11-84b2-4b97-bcfa-3e0966513fbe/volumes"
Oct 10 08:20:48 crc kubenswrapper[4732]: I1010 08:20:48.261422 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7776658547-4fgmc"
Oct 10 08:20:48 crc kubenswrapper[4732]: I1010 08:20:48.263026 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7776658547-4fgmc"
Oct 10 08:20:50 crc kubenswrapper[4732]: I1010 08:20:50.415070 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5764484767-24vlx"
Oct 10 08:20:50 crc kubenswrapper[4732]: I1010 08:20:50.516323 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5764484767-24vlx"
Oct 10 08:20:50 crc kubenswrapper[4732]: I1010 08:20:50.614482 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7776658547-4fgmc"]
Oct 10 08:20:50 crc kubenswrapper[4732]: I1010 08:20:50.614771 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7776658547-4fgmc" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-httpd" containerID="cri-o://a5d4e40b438a0aa8949423f9e8cb9c14aab33f4d719867b23c63dfa9dc537974" gracePeriod=30
Oct 10 08:20:50 crc kubenswrapper[4732]: I1010 08:20:50.615237 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7776658547-4fgmc" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-server" containerID="cri-o://6fa80bee506149becc656bb66c5b1e7d86bc0c28ba9db04dc011689a38ae3cb5" gracePeriod=30
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.516305 4732 generic.go:334] "Generic (PLEG): container finished" podID="4556001e-ee45-4c64-84a0-317b90d1583e" containerID="6fa80bee506149becc656bb66c5b1e7d86bc0c28ba9db04dc011689a38ae3cb5" exitCode=0
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.516560 4732 generic.go:334] "Generic (PLEG): container finished" podID="4556001e-ee45-4c64-84a0-317b90d1583e" containerID="a5d4e40b438a0aa8949423f9e8cb9c14aab33f4d719867b23c63dfa9dc537974" exitCode=0
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.516590 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776658547-4fgmc" event={"ID":"4556001e-ee45-4c64-84a0-317b90d1583e","Type":"ContainerDied","Data":"6fa80bee506149becc656bb66c5b1e7d86bc0c28ba9db04dc011689a38ae3cb5"}
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.516616 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776658547-4fgmc" event={"ID":"4556001e-ee45-4c64-84a0-317b90d1583e","Type":"ContainerDied","Data":"a5d4e40b438a0aa8949423f9e8cb9c14aab33f4d719867b23c63dfa9dc537974"}
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.679491 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7776658547-4fgmc"
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.846145 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-run-httpd\") pod \"4556001e-ee45-4c64-84a0-317b90d1583e\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") "
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.847015 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-combined-ca-bundle\") pod \"4556001e-ee45-4c64-84a0-317b90d1583e\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") "
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.846645 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4556001e-ee45-4c64-84a0-317b90d1583e" (UID: "4556001e-ee45-4c64-84a0-317b90d1583e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.847528 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sq57\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-kube-api-access-5sq57\") pod \"4556001e-ee45-4c64-84a0-317b90d1583e\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") "
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.848003 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-log-httpd\") pod \"4556001e-ee45-4c64-84a0-317b90d1583e\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") "
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.848194 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-etc-swift\") pod \"4556001e-ee45-4c64-84a0-317b90d1583e\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") "
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.848284 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4556001e-ee45-4c64-84a0-317b90d1583e" (UID: "4556001e-ee45-4c64-84a0-317b90d1583e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.848538 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-config-data\") pod \"4556001e-ee45-4c64-84a0-317b90d1583e\" (UID: \"4556001e-ee45-4c64-84a0-317b90d1583e\") "
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.851990 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.852258 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4556001e-ee45-4c64-84a0-317b90d1583e-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.854063 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4556001e-ee45-4c64-84a0-317b90d1583e" (UID: "4556001e-ee45-4c64-84a0-317b90d1583e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.854240 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-kube-api-access-5sq57" (OuterVolumeSpecName: "kube-api-access-5sq57") pod "4556001e-ee45-4c64-84a0-317b90d1583e" (UID: "4556001e-ee45-4c64-84a0-317b90d1583e"). InnerVolumeSpecName "kube-api-access-5sq57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.911305 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4556001e-ee45-4c64-84a0-317b90d1583e" (UID: "4556001e-ee45-4c64-84a0-317b90d1583e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.918891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-config-data" (OuterVolumeSpecName: "config-data") pod "4556001e-ee45-4c64-84a0-317b90d1583e" (UID: "4556001e-ee45-4c64-84a0-317b90d1583e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.954066 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-etc-swift\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.954098 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.954107 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4556001e-ee45-4c64-84a0-317b90d1583e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:51 crc kubenswrapper[4732]: I1010 08:20:51.954122 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sq57\" (UniqueName: \"kubernetes.io/projected/4556001e-ee45-4c64-84a0-317b90d1583e-kube-api-access-5sq57\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.531678 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7776658547-4fgmc" event={"ID":"4556001e-ee45-4c64-84a0-317b90d1583e","Type":"ContainerDied","Data":"d3e5edf993ea4575b11b90639a3a066193321235595d746c37118fd359286b8f"}
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.531785 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7776658547-4fgmc"
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.531815 4732 scope.go:117] "RemoveContainer" containerID="6fa80bee506149becc656bb66c5b1e7d86bc0c28ba9db04dc011689a38ae3cb5"
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.573858 4732 scope.go:117] "RemoveContainer" containerID="a5d4e40b438a0aa8949423f9e8cb9c14aab33f4d719867b23c63dfa9dc537974"
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.592849 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7776658547-4fgmc"]
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.600824 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7776658547-4fgmc"]
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.680963 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rx4q4"
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.681281 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rx4q4"
Oct 10 08:20:52 crc kubenswrapper[4732]: I1010 08:20:52.737401 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rx4q4"
Oct 10 08:20:53 crc kubenswrapper[4732]: I1010 08:20:53.609757 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rx4q4"
Oct 10 08:20:53 crc kubenswrapper[4732]: I1010 08:20:53.658220 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rx4q4"]
Oct 10 08:20:53 crc kubenswrapper[4732]: I1010 08:20:53.672561 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" path="/var/lib/kubelet/pods/4556001e-ee45-4c64-84a0-317b90d1583e/volumes"
Oct 10 08:20:55 crc kubenswrapper[4732]: I1010 08:20:55.355658 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 08:20:55 crc kubenswrapper[4732]: I1010 08:20:55.355761 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 08:20:55 crc kubenswrapper[4732]: I1010 08:20:55.576835 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rx4q4" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="registry-server" containerID="cri-o://6c01a50ac43d638037de195c4d8954f53c7bf57fd9c8f8cf61211366b022e3b7" gracePeriod=2
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.143695 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4jc2k"]
Oct 10 08:20:56 crc kubenswrapper[4732]: E1010 08:20:56.144070 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d08ea64-6030-4467-964d-f85c284a1a1b" containerName="swift-ring-rebalance"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144086 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d08ea64-6030-4467-964d-f85c284a1a1b" containerName="swift-ring-rebalance"
Oct 10 08:20:56 crc kubenswrapper[4732]: E1010 08:20:56.144111 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-httpd"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144120 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-httpd"
Oct 10 08:20:56 crc kubenswrapper[4732]: E1010 08:20:56.144149 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-server"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144157 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-server"
Oct 10 08:20:56 crc kubenswrapper[4732]: E1010 08:20:56.144167 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerName="dnsmasq-dns"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144173 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerName="dnsmasq-dns"
Oct 10 08:20:56 crc kubenswrapper[4732]: E1010 08:20:56.144182 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerName="init"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144187 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerName="init"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144335 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d241f11-84b2-4b97-bcfa-3e0966513fbe" containerName="dnsmasq-dns"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144352 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-httpd"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144359 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4556001e-ee45-4c64-84a0-317b90d1583e" containerName="proxy-server"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.144377 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d08ea64-6030-4467-964d-f85c284a1a1b" containerName="swift-ring-rebalance"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.145027 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4jc2k"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.154732 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4jc2k"]
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.336518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdv6r\" (UniqueName: \"kubernetes.io/projected/d6a51b14-b879-47b4-9fae-efa0ec35db2f-kube-api-access-zdv6r\") pod \"cinder-db-create-4jc2k\" (UID: \"d6a51b14-b879-47b4-9fae-efa0ec35db2f\") " pod="openstack/cinder-db-create-4jc2k"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.439067 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdv6r\" (UniqueName: \"kubernetes.io/projected/d6a51b14-b879-47b4-9fae-efa0ec35db2f-kube-api-access-zdv6r\") pod \"cinder-db-create-4jc2k\" (UID: \"d6a51b14-b879-47b4-9fae-efa0ec35db2f\") " pod="openstack/cinder-db-create-4jc2k"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.457528 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdv6r\" (UniqueName: \"kubernetes.io/projected/d6a51b14-b879-47b4-9fae-efa0ec35db2f-kube-api-access-zdv6r\") pod \"cinder-db-create-4jc2k\" (UID: \"d6a51b14-b879-47b4-9fae-efa0ec35db2f\") " pod="openstack/cinder-db-create-4jc2k"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.464074 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4jc2k"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.599046 4732 generic.go:334] "Generic (PLEG): container finished" podID="330e92eb-e631-46de-8b50-c9183d81a284" containerID="6c01a50ac43d638037de195c4d8954f53c7bf57fd9c8f8cf61211366b022e3b7" exitCode=0
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.599118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rx4q4" event={"ID":"330e92eb-e631-46de-8b50-c9183d81a284","Type":"ContainerDied","Data":"6c01a50ac43d638037de195c4d8954f53c7bf57fd9c8f8cf61211366b022e3b7"}
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.599437 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rx4q4" event={"ID":"330e92eb-e631-46de-8b50-c9183d81a284","Type":"ContainerDied","Data":"518930a1342c92b95e28ac8ec569554c8bad855edb8f0ba1a77d759cf0ea96ad"}
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.599457 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="518930a1342c92b95e28ac8ec569554c8bad855edb8f0ba1a77d759cf0ea96ad"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.600256 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rx4q4"
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.746278 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-utilities\") pod \"330e92eb-e631-46de-8b50-c9183d81a284\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") "
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.746476 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-catalog-content\") pod \"330e92eb-e631-46de-8b50-c9183d81a284\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") "
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.746576 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z5q5\" (UniqueName: \"kubernetes.io/projected/330e92eb-e631-46de-8b50-c9183d81a284-kube-api-access-6z5q5\") pod \"330e92eb-e631-46de-8b50-c9183d81a284\" (UID: \"330e92eb-e631-46de-8b50-c9183d81a284\") "
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.747672 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-utilities" (OuterVolumeSpecName: "utilities") pod "330e92eb-e631-46de-8b50-c9183d81a284" (UID: "330e92eb-e631-46de-8b50-c9183d81a284"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.757042 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330e92eb-e631-46de-8b50-c9183d81a284-kube-api-access-6z5q5" (OuterVolumeSpecName: "kube-api-access-6z5q5") pod "330e92eb-e631-46de-8b50-c9183d81a284" (UID: "330e92eb-e631-46de-8b50-c9183d81a284"). InnerVolumeSpecName "kube-api-access-6z5q5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.764557 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "330e92eb-e631-46de-8b50-c9183d81a284" (UID: "330e92eb-e631-46de-8b50-c9183d81a284"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.849191 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.849239 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/330e92eb-e631-46de-8b50-c9183d81a284-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.849255 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z5q5\" (UniqueName: \"kubernetes.io/projected/330e92eb-e631-46de-8b50-c9183d81a284-kube-api-access-6z5q5\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:56 crc kubenswrapper[4732]: I1010 08:20:56.911670 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4jc2k"]
Oct 10 08:20:57 crc kubenswrapper[4732]: I1010 08:20:57.623786 4732 generic.go:334] "Generic (PLEG): container finished" podID="d6a51b14-b879-47b4-9fae-efa0ec35db2f" containerID="2f3215762fcf4416eb0d4faa62ff1ca6bd6935426fb5aa9183eddcde9db80fe3" exitCode=0
Oct 10 08:20:57 crc kubenswrapper[4732]: I1010 08:20:57.623889 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4jc2k" event={"ID":"d6a51b14-b879-47b4-9fae-efa0ec35db2f","Type":"ContainerDied","Data":"2f3215762fcf4416eb0d4faa62ff1ca6bd6935426fb5aa9183eddcde9db80fe3"}
Oct 10 08:20:57 crc kubenswrapper[4732]: I1010 08:20:57.624365 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4jc2k" event={"ID":"d6a51b14-b879-47b4-9fae-efa0ec35db2f","Type":"ContainerStarted","Data":"fc09d1ef7f970da6d8b77f6211c06037851da12d117b5ff225481e65ce21ea59"}
Oct 10 08:20:57 crc kubenswrapper[4732]: I1010 08:20:57.624636 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rx4q4"
Oct 10 08:20:57 crc kubenswrapper[4732]: I1010 08:20:57.684636 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rx4q4"]
Oct 10 08:20:57 crc kubenswrapper[4732]: I1010 08:20:57.684732 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rx4q4"]
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.060366 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4jc2k"
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.189405 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdv6r\" (UniqueName: \"kubernetes.io/projected/d6a51b14-b879-47b4-9fae-efa0ec35db2f-kube-api-access-zdv6r\") pod \"d6a51b14-b879-47b4-9fae-efa0ec35db2f\" (UID: \"d6a51b14-b879-47b4-9fae-efa0ec35db2f\") "
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.194985 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a51b14-b879-47b4-9fae-efa0ec35db2f-kube-api-access-zdv6r" (OuterVolumeSpecName: "kube-api-access-zdv6r") pod "d6a51b14-b879-47b4-9fae-efa0ec35db2f" (UID: "d6a51b14-b879-47b4-9fae-efa0ec35db2f"). InnerVolumeSpecName "kube-api-access-zdv6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.292075 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdv6r\" (UniqueName: \"kubernetes.io/projected/d6a51b14-b879-47b4-9fae-efa0ec35db2f-kube-api-access-zdv6r\") on node \"crc\" DevicePath \"\""
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.644124 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4jc2k" event={"ID":"d6a51b14-b879-47b4-9fae-efa0ec35db2f","Type":"ContainerDied","Data":"fc09d1ef7f970da6d8b77f6211c06037851da12d117b5ff225481e65ce21ea59"}
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.644433 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc09d1ef7f970da6d8b77f6211c06037851da12d117b5ff225481e65ce21ea59"
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.644221 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4jc2k"
Oct 10 08:20:59 crc kubenswrapper[4732]: I1010 08:20:59.675963 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330e92eb-e631-46de-8b50-c9183d81a284" path="/var/lib/kubelet/pods/330e92eb-e631-46de-8b50-c9183d81a284/volumes"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.235413 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e330-account-create-p9ll6"]
Oct 10 08:21:06 crc kubenswrapper[4732]: E1010 08:21:06.236622 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a51b14-b879-47b4-9fae-efa0ec35db2f" containerName="mariadb-database-create"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.236647 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a51b14-b879-47b4-9fae-efa0ec35db2f" containerName="mariadb-database-create"
Oct 10 08:21:06 crc kubenswrapper[4732]: E1010 08:21:06.236678 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="extract-utilities"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.236687 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="extract-utilities"
Oct 10 08:21:06 crc kubenswrapper[4732]: E1010 08:21:06.236756 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="registry-server"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.236768 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="registry-server"
Oct 10 08:21:06 crc kubenswrapper[4732]: E1010 08:21:06.236779 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="extract-content"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.236787 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="extract-content"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.236998 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="330e92eb-e631-46de-8b50-c9183d81a284" containerName="registry-server"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.237026 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a51b14-b879-47b4-9fae-efa0ec35db2f" containerName="mariadb-database-create"
Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.237857 4732 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-e330-account-create-p9ll6" Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.240517 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.251596 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e330-account-create-p9ll6"] Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.331663 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7np\" (UniqueName: \"kubernetes.io/projected/a52aa122-1110-42ef-90b9-2fff969a770b-kube-api-access-pn7np\") pod \"cinder-e330-account-create-p9ll6\" (UID: \"a52aa122-1110-42ef-90b9-2fff969a770b\") " pod="openstack/cinder-e330-account-create-p9ll6" Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.434103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7np\" (UniqueName: \"kubernetes.io/projected/a52aa122-1110-42ef-90b9-2fff969a770b-kube-api-access-pn7np\") pod \"cinder-e330-account-create-p9ll6\" (UID: \"a52aa122-1110-42ef-90b9-2fff969a770b\") " pod="openstack/cinder-e330-account-create-p9ll6" Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.454049 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7np\" (UniqueName: \"kubernetes.io/projected/a52aa122-1110-42ef-90b9-2fff969a770b-kube-api-access-pn7np\") pod \"cinder-e330-account-create-p9ll6\" (UID: \"a52aa122-1110-42ef-90b9-2fff969a770b\") " pod="openstack/cinder-e330-account-create-p9ll6" Oct 10 08:21:06 crc kubenswrapper[4732]: I1010 08:21:06.580837 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e330-account-create-p9ll6" Oct 10 08:21:07 crc kubenswrapper[4732]: I1010 08:21:07.028306 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e330-account-create-p9ll6"] Oct 10 08:21:07 crc kubenswrapper[4732]: I1010 08:21:07.712286 4732 generic.go:334] "Generic (PLEG): container finished" podID="a52aa122-1110-42ef-90b9-2fff969a770b" containerID="2aa093add240e8511c83d6b028157319f00bd051a5e9cb68dd4aa4480811b567" exitCode=0 Oct 10 08:21:07 crc kubenswrapper[4732]: I1010 08:21:07.712362 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e330-account-create-p9ll6" event={"ID":"a52aa122-1110-42ef-90b9-2fff969a770b","Type":"ContainerDied","Data":"2aa093add240e8511c83d6b028157319f00bd051a5e9cb68dd4aa4480811b567"} Oct 10 08:21:07 crc kubenswrapper[4732]: I1010 08:21:07.712631 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e330-account-create-p9ll6" event={"ID":"a52aa122-1110-42ef-90b9-2fff969a770b","Type":"ContainerStarted","Data":"0ffe2dfc7b278b1b2b39d09307573c3a44b3f5d10061a8653d195138932e30a9"} Oct 10 08:21:09 crc kubenswrapper[4732]: I1010 08:21:09.099567 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e330-account-create-p9ll6" Oct 10 08:21:09 crc kubenswrapper[4732]: I1010 08:21:09.195258 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn7np\" (UniqueName: \"kubernetes.io/projected/a52aa122-1110-42ef-90b9-2fff969a770b-kube-api-access-pn7np\") pod \"a52aa122-1110-42ef-90b9-2fff969a770b\" (UID: \"a52aa122-1110-42ef-90b9-2fff969a770b\") " Oct 10 08:21:09 crc kubenswrapper[4732]: I1010 08:21:09.201067 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52aa122-1110-42ef-90b9-2fff969a770b-kube-api-access-pn7np" (OuterVolumeSpecName: "kube-api-access-pn7np") pod "a52aa122-1110-42ef-90b9-2fff969a770b" (UID: "a52aa122-1110-42ef-90b9-2fff969a770b"). InnerVolumeSpecName "kube-api-access-pn7np". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:21:09 crc kubenswrapper[4732]: I1010 08:21:09.298167 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn7np\" (UniqueName: \"kubernetes.io/projected/a52aa122-1110-42ef-90b9-2fff969a770b-kube-api-access-pn7np\") on node \"crc\" DevicePath \"\"" Oct 10 08:21:09 crc kubenswrapper[4732]: I1010 08:21:09.730915 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e330-account-create-p9ll6" event={"ID":"a52aa122-1110-42ef-90b9-2fff969a770b","Type":"ContainerDied","Data":"0ffe2dfc7b278b1b2b39d09307573c3a44b3f5d10061a8653d195138932e30a9"} Oct 10 08:21:09 crc kubenswrapper[4732]: I1010 08:21:09.730952 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ffe2dfc7b278b1b2b39d09307573c3a44b3f5d10061a8653d195138932e30a9" Oct 10 08:21:09 crc kubenswrapper[4732]: I1010 08:21:09.731007 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-e330-account-create-p9ll6" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.371325 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9hfhx"] Oct 10 08:21:11 crc kubenswrapper[4732]: E1010 08:21:11.371936 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52aa122-1110-42ef-90b9-2fff969a770b" containerName="mariadb-account-create" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.371948 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52aa122-1110-42ef-90b9-2fff969a770b" containerName="mariadb-account-create" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.372115 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52aa122-1110-42ef-90b9-2fff969a770b" containerName="mariadb-account-create" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.372712 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.374781 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9cnkg" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.375945 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.376798 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.389943 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9hfhx"] Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.440102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-scripts\") pod \"cinder-db-sync-9hfhx\" (UID: 
\"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.440191 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-config-data\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.440265 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhkqq\" (UniqueName: \"kubernetes.io/projected/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-kube-api-access-vhkqq\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.440293 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-etc-machine-id\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.440315 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-db-sync-config-data\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.440492 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-combined-ca-bundle\") pod \"cinder-db-sync-9hfhx\" (UID: 
\"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.542255 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-db-sync-config-data\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.542353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-combined-ca-bundle\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.542403 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-scripts\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.542452 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-config-data\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.542502 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkqq\" (UniqueName: \"kubernetes.io/projected/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-kube-api-access-vhkqq\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 
08:21:11.542521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-etc-machine-id\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.542582 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-etc-machine-id\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.547577 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-db-sync-config-data\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.548764 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-scripts\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.549309 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-combined-ca-bundle\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.554664 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-config-data\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.560820 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhkqq\" (UniqueName: \"kubernetes.io/projected/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-kube-api-access-vhkqq\") pod \"cinder-db-sync-9hfhx\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:11 crc kubenswrapper[4732]: I1010 08:21:11.695435 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:12 crc kubenswrapper[4732]: I1010 08:21:12.177384 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9hfhx"] Oct 10 08:21:12 crc kubenswrapper[4732]: I1010 08:21:12.763107 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9hfhx" event={"ID":"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb","Type":"ContainerStarted","Data":"628c9b51a9787294429db760016afec8034371fade4467d74eb8b4aee59c5778"} Oct 10 08:21:25 crc kubenswrapper[4732]: I1010 08:21:25.356093 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:21:25 crc kubenswrapper[4732]: I1010 08:21:25.356901 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:21:26 crc kubenswrapper[4732]: 
I1010 08:21:26.059434 4732 scope.go:117] "RemoveContainer" containerID="7c9defd0248280428792c98933b9b7afc1b84199fe82ae3934a01105b1adb60f" Oct 10 08:21:31 crc kubenswrapper[4732]: I1010 08:21:31.862271 4732 scope.go:117] "RemoveContainer" containerID="e469593e2af201ad667e13f8d566d1d6d25fbd584a1ba65098870622281670ff" Oct 10 08:21:33 crc kubenswrapper[4732]: E1010 08:21:33.138172 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c4b77291aeca5591ac860bd4127cec2f" Oct 10 08:21:33 crc kubenswrapper[4732]: E1010 08:21:33.138257 4732 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c4b77291aeca5591ac860bd4127cec2f" Oct 10 08:21:33 crc kubenswrapper[4732]: E1010 08:21:33.138467 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c4b77291aeca5591ac860bd4127cec2f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhkqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9hfhx_openstack(0ef503bf-2e69-4fdc-89d6-0ab09f364bbb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 08:21:33 crc kubenswrapper[4732]: E1010 08:21:33.139744 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9hfhx" podUID="0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" Oct 10 08:21:33 crc kubenswrapper[4732]: E1010 08:21:33.959623 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:c4b77291aeca5591ac860bd4127cec2f\\\"\"" pod="openstack/cinder-db-sync-9hfhx" podUID="0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" Oct 10 08:21:48 crc kubenswrapper[4732]: I1010 08:21:48.108827 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9hfhx" event={"ID":"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb","Type":"ContainerStarted","Data":"f94df7a0d8fc8d013190033d8a5b7e6b4cf69aaeeb8cf4b878c8bb8fd03b84ed"} Oct 10 08:21:48 crc kubenswrapper[4732]: I1010 08:21:48.145980 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9hfhx" podStartSLOduration=2.41851895 podStartE2EDuration="37.145944632s" podCreationTimestamp="2025-10-10 08:21:11 +0000 UTC" firstStartedPulling="2025-10-10 08:21:12.174181743 +0000 UTC m=+5399.243772984" lastFinishedPulling="2025-10-10 08:21:46.901607425 +0000 UTC m=+5433.971198666" observedRunningTime="2025-10-10 08:21:48.137424742 +0000 UTC m=+5435.207016064" 
watchObservedRunningTime="2025-10-10 08:21:48.145944632 +0000 UTC m=+5435.215535913" Oct 10 08:21:52 crc kubenswrapper[4732]: I1010 08:21:52.143676 4732 generic.go:334] "Generic (PLEG): container finished" podID="0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" containerID="f94df7a0d8fc8d013190033d8a5b7e6b4cf69aaeeb8cf4b878c8bb8fd03b84ed" exitCode=0 Oct 10 08:21:52 crc kubenswrapper[4732]: I1010 08:21:52.143788 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9hfhx" event={"ID":"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb","Type":"ContainerDied","Data":"f94df7a0d8fc8d013190033d8a5b7e6b4cf69aaeeb8cf4b878c8bb8fd03b84ed"} Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.524606 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9hfhx" Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.598485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-config-data\") pod \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.598540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-db-sync-config-data\") pod \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.598621 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-combined-ca-bundle\") pod \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.598899 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-etc-machine-id\") pod \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.599017 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhkqq\" (UniqueName: \"kubernetes.io/projected/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-kube-api-access-vhkqq\") pod \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.599077 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-scripts\") pod \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\" (UID: \"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb\") " Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.599091 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" (UID: "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.600042 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.604243 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-scripts" (OuterVolumeSpecName: "scripts") pod "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" (UID: "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.604319 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-kube-api-access-vhkqq" (OuterVolumeSpecName: "kube-api-access-vhkqq") pod "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" (UID: "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb"). InnerVolumeSpecName "kube-api-access-vhkqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.604510 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" (UID: "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.629024 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" (UID: "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.676767 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-config-data" (OuterVolumeSpecName: "config-data") pod "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" (UID: "0ef503bf-2e69-4fdc-89d6-0ab09f364bbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.701578 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhkqq\" (UniqueName: \"kubernetes.io/projected/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-kube-api-access-vhkqq\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.701618 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.701636 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.701655 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:53.701672 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.172716 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9hfhx" event={"ID":"0ef503bf-2e69-4fdc-89d6-0ab09f364bbb","Type":"ContainerDied","Data":"628c9b51a9787294429db760016afec8034371fade4467d74eb8b4aee59c5778"}
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.173033 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="628c9b51a9787294429db760016afec8034371fade4467d74eb8b4aee59c5778"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.172777 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9hfhx"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.507380 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"]
Oct 10 08:21:54 crc kubenswrapper[4732]: E1010 08:21:54.507743 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" containerName="cinder-db-sync"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.507758 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" containerName="cinder-db-sync"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.507924 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" containerName="cinder-db-sync"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.508846 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.523203 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"]
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.609956 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.611556 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618152 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618316 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9cnkg"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618421 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618665 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618813 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-config\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-nb\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618934 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xppxz\" (UniqueName: \"kubernetes.io/projected/64e501ef-9f03-4ca9-b9f7-425bb81a3435-kube-api-access-xppxz\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618961 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-sb\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.618983 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-dns-svc\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.621140 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.720892 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-nb\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.720960 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xppxz\" (UniqueName: \"kubernetes.io/projected/64e501ef-9f03-4ca9-b9f7-425bb81a3435-kube-api-access-xppxz\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.720985 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e58310-13f9-4cb1-9dda-5c012d2569a9-logs\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721047 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-sb\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721070 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-dns-svc\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721092 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data-custom\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721357 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8e58310-13f9-4cb1-9dda-5c012d2569a9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721440 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht4dw\" (UniqueName: \"kubernetes.io/projected/c8e58310-13f9-4cb1-9dda-5c012d2569a9-kube-api-access-ht4dw\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721469 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-scripts\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721566 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.721637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-config\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.722053 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-sb\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.722117 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-dns-svc\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.722468 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-config\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.723835 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-nb\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.740786 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xppxz\" (UniqueName: \"kubernetes.io/projected/64e501ef-9f03-4ca9-b9f7-425bb81a3435-kube-api-access-xppxz\") pod \"dnsmasq-dns-6fbf48bcd7-b7vc5\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.822516 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823134 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8e58310-13f9-4cb1-9dda-5c012d2569a9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823179 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht4dw\" (UniqueName: \"kubernetes.io/projected/c8e58310-13f9-4cb1-9dda-5c012d2569a9-kube-api-access-ht4dw\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823212 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-scripts\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823230 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8e58310-13f9-4cb1-9dda-5c012d2569a9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823381 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823486 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823525 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e58310-13f9-4cb1-9dda-5c012d2569a9-logs\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.823564 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data-custom\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.825230 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e58310-13f9-4cb1-9dda-5c012d2569a9-logs\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.826624 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.828276 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data-custom\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.829682 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.833455 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-scripts\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.841157 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht4dw\" (UniqueName: \"kubernetes.io/projected/c8e58310-13f9-4cb1-9dda-5c012d2569a9-kube-api-access-ht4dw\") pod \"cinder-api-0\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") " pod="openstack/cinder-api-0"
Oct 10 08:21:54 crc kubenswrapper[4732]: I1010 08:21:54.943417 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:21:55 crc kubenswrapper[4732]: I1010 08:21:55.263833 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"]
Oct 10 08:21:55 crc kubenswrapper[4732]: I1010 08:21:55.356022 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 08:21:55 crc kubenswrapper[4732]: I1010 08:21:55.356083 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 08:21:55 crc kubenswrapper[4732]: I1010 08:21:55.356139 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd"
Oct 10 08:21:55 crc kubenswrapper[4732]: I1010 08:21:55.356879 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7df48300028f267e40178e485796865eb5f10b524ed9fcd0a9aaeef67e08b38f"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 10 08:21:55 crc kubenswrapper[4732]: I1010 08:21:55.356941 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://7df48300028f267e40178e485796865eb5f10b524ed9fcd0a9aaeef67e08b38f" gracePeriod=600
Oct 10 08:21:55 crc kubenswrapper[4732]: I1010 08:21:55.428910 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:21:55 crc kubenswrapper[4732]: W1010 08:21:55.431375 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8e58310_13f9_4cb1_9dda_5c012d2569a9.slice/crio-b26200b2d2bba99f5e4d26a2fe865a0e2b8086767550231f3c4165ed0027cf5c WatchSource:0}: Error finding container b26200b2d2bba99f5e4d26a2fe865a0e2b8086767550231f3c4165ed0027cf5c: Status 404 returned error can't find the container with id b26200b2d2bba99f5e4d26a2fe865a0e2b8086767550231f3c4165ed0027cf5c
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.214181 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="7df48300028f267e40178e485796865eb5f10b524ed9fcd0a9aaeef67e08b38f" exitCode=0
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.214733 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"7df48300028f267e40178e485796865eb5f10b524ed9fcd0a9aaeef67e08b38f"}
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.214764 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1"}
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.214802 4732 scope.go:117] "RemoveContainer" containerID="f9ccd14bb02fa809c0f9809a52e17e211386e72e9c33b7b7b37f45d3c20e09ea"
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.242528 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8e58310-13f9-4cb1-9dda-5c012d2569a9","Type":"ContainerStarted","Data":"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0"}
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.242571 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8e58310-13f9-4cb1-9dda-5c012d2569a9","Type":"ContainerStarted","Data":"b26200b2d2bba99f5e4d26a2fe865a0e2b8086767550231f3c4165ed0027cf5c"}
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.252101 4732 generic.go:334] "Generic (PLEG): container finished" podID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerID="4ee06c0a96b42def01f7af723f735815bd9f8b3e9715b8944193e295421f02b8" exitCode=0
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.252245 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" event={"ID":"64e501ef-9f03-4ca9-b9f7-425bb81a3435","Type":"ContainerDied","Data":"4ee06c0a96b42def01f7af723f735815bd9f8b3e9715b8944193e295421f02b8"}
Oct 10 08:21:56 crc kubenswrapper[4732]: I1010 08:21:56.252310 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" event={"ID":"64e501ef-9f03-4ca9-b9f7-425bb81a3435","Type":"ContainerStarted","Data":"fffa92627b15b02f90a500e6ddc93521e7b739819176feffe282a5a5042811d9"}
Oct 10 08:21:57 crc kubenswrapper[4732]: I1010 08:21:57.255542 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:21:57 crc kubenswrapper[4732]: I1010 08:21:57.264974 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8e58310-13f9-4cb1-9dda-5c012d2569a9","Type":"ContainerStarted","Data":"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"}
Oct 10 08:21:57 crc kubenswrapper[4732]: I1010 08:21:57.265663 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 10 08:21:57 crc kubenswrapper[4732]: I1010 08:21:57.267020 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" event={"ID":"64e501ef-9f03-4ca9-b9f7-425bb81a3435","Type":"ContainerStarted","Data":"8102c3a5a8110272ae763a3f38ed415884058b0c397765ee61db62162e2ec321"}
Oct 10 08:21:57 crc kubenswrapper[4732]: I1010 08:21:57.267384 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"
Oct 10 08:21:57 crc kubenswrapper[4732]: I1010 08:21:57.282942 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.282920818 podStartE2EDuration="3.282920818s" podCreationTimestamp="2025-10-10 08:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:21:57.281351136 +0000 UTC m=+5444.350942377" watchObservedRunningTime="2025-10-10 08:21:57.282920818 +0000 UTC m=+5444.352512059"
Oct 10 08:21:57 crc kubenswrapper[4732]: I1010 08:21:57.304108 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" podStartSLOduration=3.304090839 podStartE2EDuration="3.304090839s" podCreationTimestamp="2025-10-10 08:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:21:57.297411259 +0000 UTC m=+5444.367002500" watchObservedRunningTime="2025-10-10 08:21:57.304090839 +0000 UTC m=+5444.373682080"
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.275054 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerName="cinder-api-log" containerID="cri-o://17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0" gracePeriod=30
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.275895 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerName="cinder-api" containerID="cri-o://4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05" gracePeriod=30
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.796857 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.807020 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-combined-ca-bundle\") pod \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") "
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.807074 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e58310-13f9-4cb1-9dda-5c012d2569a9-logs\") pod \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") "
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.807108 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht4dw\" (UniqueName: \"kubernetes.io/projected/c8e58310-13f9-4cb1-9dda-5c012d2569a9-kube-api-access-ht4dw\") pod \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") "
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.807134 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data\") pod \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") "
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.807181 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8e58310-13f9-4cb1-9dda-5c012d2569a9-etc-machine-id\") pod \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") "
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.807203 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data-custom\") pod \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") "
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.807241 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-scripts\") pod \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\" (UID: \"c8e58310-13f9-4cb1-9dda-5c012d2569a9\") "
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.808452 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e58310-13f9-4cb1-9dda-5c012d2569a9-logs" (OuterVolumeSpecName: "logs") pod "c8e58310-13f9-4cb1-9dda-5c012d2569a9" (UID: "c8e58310-13f9-4cb1-9dda-5c012d2569a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.808446 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8e58310-13f9-4cb1-9dda-5c012d2569a9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c8e58310-13f9-4cb1-9dda-5c012d2569a9" (UID: "c8e58310-13f9-4cb1-9dda-5c012d2569a9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.816045 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-scripts" (OuterVolumeSpecName: "scripts") pod "c8e58310-13f9-4cb1-9dda-5c012d2569a9" (UID: "c8e58310-13f9-4cb1-9dda-5c012d2569a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.825947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e58310-13f9-4cb1-9dda-5c012d2569a9-kube-api-access-ht4dw" (OuterVolumeSpecName: "kube-api-access-ht4dw") pod "c8e58310-13f9-4cb1-9dda-5c012d2569a9" (UID: "c8e58310-13f9-4cb1-9dda-5c012d2569a9"). InnerVolumeSpecName "kube-api-access-ht4dw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.830825 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c8e58310-13f9-4cb1-9dda-5c012d2569a9" (UID: "c8e58310-13f9-4cb1-9dda-5c012d2569a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.858090 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8e58310-13f9-4cb1-9dda-5c012d2569a9" (UID: "c8e58310-13f9-4cb1-9dda-5c012d2569a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.886001 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data" (OuterVolumeSpecName: "config-data") pod "c8e58310-13f9-4cb1-9dda-5c012d2569a9" (UID: "c8e58310-13f9-4cb1-9dda-5c012d2569a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.909263 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht4dw\" (UniqueName: \"kubernetes.io/projected/c8e58310-13f9-4cb1-9dda-5c012d2569a9-kube-api-access-ht4dw\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.909304 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.909314 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8e58310-13f9-4cb1-9dda-5c012d2569a9-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.909322 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.909331 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.909339 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e58310-13f9-4cb1-9dda-5c012d2569a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:58 crc kubenswrapper[4732]: I1010 08:21:58.909348 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e58310-13f9-4cb1-9dda-5c012d2569a9-logs\") on node \"crc\" DevicePath \"\""
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.288857 4732 generic.go:334] "Generic (PLEG): container finished" podID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerID="4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05" exitCode=0
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.288889 4732 generic.go:334] "Generic (PLEG): container finished" podID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerID="17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0" exitCode=143
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.288909 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8e58310-13f9-4cb1-9dda-5c012d2569a9","Type":"ContainerDied","Data":"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"}
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.288934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8e58310-13f9-4cb1-9dda-5c012d2569a9","Type":"ContainerDied","Data":"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0"}
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.288944 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c8e58310-13f9-4cb1-9dda-5c012d2569a9","Type":"ContainerDied","Data":"b26200b2d2bba99f5e4d26a2fe865a0e2b8086767550231f3c4165ed0027cf5c"}
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.288958 4732 scope.go:117] "RemoveContainer" containerID="4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.289071 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.333562 4732 scope.go:117] "RemoveContainer" containerID="17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.333576 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.356447 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.368328 4732 scope.go:117] "RemoveContainer" containerID="4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"
Oct 10 08:21:59 crc kubenswrapper[4732]: E1010 08:21:59.368873 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05\": container with ID starting with 4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05 not found: ID does not exist" containerID="4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.368919 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"} err="failed to get container status \"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05\": rpc error: code = NotFound desc = could not find container \"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05\": container with ID starting with 4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05 not found: ID does not exist"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.368944 4732 scope.go:117] "RemoveContainer" containerID="17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0"
Oct 10 08:21:59 crc kubenswrapper[4732]: E1010 08:21:59.369274 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0\": container with ID starting with 17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0 not found: ID does not exist" containerID="17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.369301 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0"} err="failed to get container status \"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0\": rpc error: code = NotFound desc = could not find container \"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0\": container with ID starting with 17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0 not found: ID does not exist"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.369318 4732 scope.go:117] "RemoveContainer" containerID="4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.369541 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05"} err="failed to get container status \"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05\": rpc error: code = NotFound desc = could not find container \"4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05\": container with ID starting with 4edf4ff4fba6e58cf85209e6fd4110958af84a3a37642f1b7fdee0361ccd3c05 not found: ID does not exist"
Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.369565 4732 scope.go:117]
"RemoveContainer" containerID="17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.369761 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0"} err="failed to get container status \"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0\": rpc error: code = NotFound desc = could not find container \"17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0\": container with ID starting with 17a0946b5c3d67277e073e686e3324bbb55b8a52f44f1bbe461c0670934ad8a0 not found: ID does not exist" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.402836 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:21:59 crc kubenswrapper[4732]: E1010 08:21:59.403306 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerName="cinder-api" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.403329 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerName="cinder-api" Oct 10 08:21:59 crc kubenswrapper[4732]: E1010 08:21:59.403347 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerName="cinder-api-log" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.403356 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerName="cinder-api-log" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.403629 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" containerName="cinder-api" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.403656 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" 
containerName="cinder-api-log" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.404993 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.409277 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.409553 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9cnkg" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.409752 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.409988 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.410116 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.410227 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.423881 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.518233 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.518798 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data\") pod 
\"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.518898 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfv5r\" (UniqueName: \"kubernetes.io/projected/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-kube-api-access-wfv5r\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.519052 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data-custom\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.519177 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.519247 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.519306 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-scripts\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc 
kubenswrapper[4732]: I1010 08:21:59.519562 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.519592 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-logs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.621818 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.621910 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.621952 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfv5r\" (UniqueName: \"kubernetes.io/projected/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-kube-api-access-wfv5r\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.622053 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data-custom\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.622084 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.622122 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.622180 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-scripts\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.622275 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-logs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.622328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.622568 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-etc-machine-id\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.623241 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-logs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.629664 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-public-tls-certs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.629883 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.631659 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.632792 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data-custom\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " 
pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.634788 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-scripts\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.645643 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.645982 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfv5r\" (UniqueName: \"kubernetes.io/projected/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-kube-api-access-wfv5r\") pod \"cinder-api-0\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") " pod="openstack/cinder-api-0" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.669179 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8e58310-13f9-4cb1-9dda-5c012d2569a9" path="/var/lib/kubelet/pods/c8e58310-13f9-4cb1-9dda-5c012d2569a9/volumes" Oct 10 08:21:59 crc kubenswrapper[4732]: I1010 08:21:59.751311 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 10 08:22:00 crc kubenswrapper[4732]: I1010 08:22:00.232415 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 10 08:22:00 crc kubenswrapper[4732]: W1010 08:22:00.240141 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85c3a2a2_fe9a_40d4_9356_4a6c7d7e5baf.slice/crio-0465b4761ff491facfcdece095c0e8f8b980fd9b4d6a7907c39a7c2373480d05 WatchSource:0}: Error finding container 0465b4761ff491facfcdece095c0e8f8b980fd9b4d6a7907c39a7c2373480d05: Status 404 returned error can't find the container with id 0465b4761ff491facfcdece095c0e8f8b980fd9b4d6a7907c39a7c2373480d05 Oct 10 08:22:00 crc kubenswrapper[4732]: I1010 08:22:00.300892 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf","Type":"ContainerStarted","Data":"0465b4761ff491facfcdece095c0e8f8b980fd9b4d6a7907c39a7c2373480d05"} Oct 10 08:22:04 crc kubenswrapper[4732]: I1010 08:22:04.352628 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf","Type":"ContainerStarted","Data":"dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68"} Oct 10 08:22:04 crc kubenswrapper[4732]: I1010 08:22:04.353293 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 08:22:04 crc kubenswrapper[4732]: I1010 08:22:04.353310 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf","Type":"ContainerStarted","Data":"a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703"} Oct 10 08:22:04 crc kubenswrapper[4732]: I1010 08:22:04.825374 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" Oct 10 08:22:04 
crc kubenswrapper[4732]: I1010 08:22:04.863443 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.863412214 podStartE2EDuration="5.863412214s" podCreationTimestamp="2025-10-10 08:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:22:04.371993007 +0000 UTC m=+5451.441584268" watchObservedRunningTime="2025-10-10 08:22:04.863412214 +0000 UTC m=+5451.933003475" Oct 10 08:22:04 crc kubenswrapper[4732]: I1010 08:22:04.909275 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c48b4dd5-8xlrw"] Oct 10 08:22:04 crc kubenswrapper[4732]: I1010 08:22:04.909481 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" podUID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerName="dnsmasq-dns" containerID="cri-o://b12c3691d0cdba7073f7a1a1f7d9d957d30652b02a5c306783cef110128e1889" gracePeriod=10 Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.363211 4732 generic.go:334] "Generic (PLEG): container finished" podID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerID="b12c3691d0cdba7073f7a1a1f7d9d957d30652b02a5c306783cef110128e1889" exitCode=0 Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.363817 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" event={"ID":"d54d4b7e-2314-4610-bfe8-8faa5f0406d8","Type":"ContainerDied","Data":"b12c3691d0cdba7073f7a1a1f7d9d957d30652b02a5c306783cef110128e1889"} Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.363875 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" event={"ID":"d54d4b7e-2314-4610-bfe8-8faa5f0406d8","Type":"ContainerDied","Data":"249304babd57c03cf8da4f09e8bcfbe2245369f62aed3882d6be3ab7ab8be21a"} Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.363888 
4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="249304babd57c03cf8da4f09e8bcfbe2245369f62aed3882d6be3ab7ab8be21a" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.405023 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.539003 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-sb\") pod \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.539088 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-config\") pod \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.539127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-dns-svc\") pod \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.539246 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-nb\") pod \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.539304 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4p7t\" (UniqueName: 
\"kubernetes.io/projected/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-kube-api-access-x4p7t\") pod \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\" (UID: \"d54d4b7e-2314-4610-bfe8-8faa5f0406d8\") " Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.568643 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-kube-api-access-x4p7t" (OuterVolumeSpecName: "kube-api-access-x4p7t") pod "d54d4b7e-2314-4610-bfe8-8faa5f0406d8" (UID: "d54d4b7e-2314-4610-bfe8-8faa5f0406d8"). InnerVolumeSpecName "kube-api-access-x4p7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.594639 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d54d4b7e-2314-4610-bfe8-8faa5f0406d8" (UID: "d54d4b7e-2314-4610-bfe8-8faa5f0406d8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.596824 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-config" (OuterVolumeSpecName: "config") pod "d54d4b7e-2314-4610-bfe8-8faa5f0406d8" (UID: "d54d4b7e-2314-4610-bfe8-8faa5f0406d8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.598301 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d54d4b7e-2314-4610-bfe8-8faa5f0406d8" (UID: "d54d4b7e-2314-4610-bfe8-8faa5f0406d8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.600465 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d54d4b7e-2314-4610-bfe8-8faa5f0406d8" (UID: "d54d4b7e-2314-4610-bfe8-8faa5f0406d8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.641621 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.641661 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4p7t\" (UniqueName: \"kubernetes.io/projected/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-kube-api-access-x4p7t\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.641676 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.641687 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:05 crc kubenswrapper[4732]: I1010 08:22:05.641723 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54d4b7e-2314-4610-bfe8-8faa5f0406d8-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:06 crc kubenswrapper[4732]: I1010 08:22:06.371469 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c48b4dd5-8xlrw" Oct 10 08:22:06 crc kubenswrapper[4732]: I1010 08:22:06.397375 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c48b4dd5-8xlrw"] Oct 10 08:22:06 crc kubenswrapper[4732]: I1010 08:22:06.403597 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c48b4dd5-8xlrw"] Oct 10 08:22:07 crc kubenswrapper[4732]: I1010 08:22:07.672973 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" path="/var/lib/kubelet/pods/d54d4b7e-2314-4610-bfe8-8faa5f0406d8/volumes" Oct 10 08:22:11 crc kubenswrapper[4732]: I1010 08:22:11.592833 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.210422 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:22:28 crc kubenswrapper[4732]: E1010 08:22:28.211389 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerName="dnsmasq-dns" Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.211404 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerName="dnsmasq-dns" Oct 10 08:22:28 crc kubenswrapper[4732]: E1010 08:22:28.211428 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerName="init" Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.211436 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerName="init" Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.211665 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54d4b7e-2314-4610-bfe8-8faa5f0406d8" containerName="dnsmasq-dns" Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.212812 4732 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.218946 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.235468 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.277833 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n8jf\" (UniqueName: \"kubernetes.io/projected/204a7c7c-d10f-4402-afa2-59dc26b84240-kube-api-access-4n8jf\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.278069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.278119 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.278162 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204a7c7c-d10f-4402-afa2-59dc26b84240-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.278190 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-scripts\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.278221 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.379467 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.379523 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204a7c7c-d10f-4402-afa2-59dc26b84240-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.379551 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-scripts\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.379581 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.379644 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n8jf\" (UniqueName: \"kubernetes.io/projected/204a7c7c-d10f-4402-afa2-59dc26b84240-kube-api-access-4n8jf\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.379661 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.380628 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204a7c7c-d10f-4402-afa2-59dc26b84240-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.386313 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.386629 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-scripts\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.386955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.387168 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.396064 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n8jf\" (UniqueName: \"kubernetes.io/projected/204a7c7c-d10f-4402-afa2-59dc26b84240-kube-api-access-4n8jf\") pod \"cinder-scheduler-0\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " pod="openstack/cinder-scheduler-0"
Oct 10 08:22:28 crc kubenswrapper[4732]: I1010 08:22:28.541310 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 10 08:22:29 crc kubenswrapper[4732]: I1010 08:22:29.001669 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 10 08:22:29 crc kubenswrapper[4732]: W1010 08:22:29.018908 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204a7c7c_d10f_4402_afa2_59dc26b84240.slice/crio-417df0b0ee62d38a991080337e93689dadaaab8975e81adc0d1a9370b45ed141 WatchSource:0}: Error finding container 417df0b0ee62d38a991080337e93689dadaaab8975e81adc0d1a9370b45ed141: Status 404 returned error can't find the container with id 417df0b0ee62d38a991080337e93689dadaaab8975e81adc0d1a9370b45ed141
Oct 10 08:22:29 crc kubenswrapper[4732]: I1010 08:22:29.555350 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:22:29 crc kubenswrapper[4732]: I1010 08:22:29.555929 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api-log" containerID="cri-o://a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703" gracePeriod=30
Oct 10 08:22:29 crc kubenswrapper[4732]: I1010 08:22:29.557761 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api" containerID="cri-o://dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68" gracePeriod=30
Oct 10 08:22:29 crc kubenswrapper[4732]: I1010 08:22:29.638003 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"204a7c7c-d10f-4402-afa2-59dc26b84240","Type":"ContainerStarted","Data":"417df0b0ee62d38a991080337e93689dadaaab8975e81adc0d1a9370b45ed141"}
Oct 10 08:22:30 crc kubenswrapper[4732]: I1010 08:22:30.650623 4732 generic.go:334] "Generic (PLEG): container finished" podID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerID="a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703" exitCode=143
Oct 10 08:22:30 crc kubenswrapper[4732]: I1010 08:22:30.651138 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf","Type":"ContainerDied","Data":"a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703"}
Oct 10 08:22:30 crc kubenswrapper[4732]: I1010 08:22:30.654508 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"204a7c7c-d10f-4402-afa2-59dc26b84240","Type":"ContainerStarted","Data":"1d20cb0823d5c1581f8c98ef188a326297ea8f79014cdeae11dc7dc8690134c0"}
Oct 10 08:22:30 crc kubenswrapper[4732]: I1010 08:22:30.654538 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"204a7c7c-d10f-4402-afa2-59dc26b84240","Type":"ContainerStarted","Data":"848176b4f486223709246b715fc88edd7abced4dfa2b93f698fe1ca8d4bf91c4"}
Oct 10 08:22:30 crc kubenswrapper[4732]: I1010 08:22:30.691470 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.417429046 podStartE2EDuration="2.691448088s" podCreationTimestamp="2025-10-10 08:22:28 +0000 UTC" firstStartedPulling="2025-10-10 08:22:29.021023096 +0000 UTC m=+5476.090614337" lastFinishedPulling="2025-10-10 08:22:29.295042138 +0000 UTC m=+5476.364633379" observedRunningTime="2025-10-10 08:22:30.681341745 +0000 UTC m=+5477.750933026" watchObservedRunningTime="2025-10-10 08:22:30.691448088 +0000 UTC m=+5477.761039349"
Oct 10 08:22:32 crc kubenswrapper[4732]: I1010 08:22:32.713356 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.52:8776/healthcheck\": read tcp 10.217.0.2:60106->10.217.1.52:8776: read: connection reset by peer"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.158218 4732 scope.go:117] "RemoveContainer" containerID="a80153ef60ebeca56a582cd838e49cc895bce0cff31faf651b12bea8db2e1f0d"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.181810 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275009 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data-custom\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275073 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfv5r\" (UniqueName: \"kubernetes.io/projected/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-kube-api-access-wfv5r\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275249 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-public-tls-certs\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275285 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-etc-machine-id\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275354 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-logs\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275405 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-internal-tls-certs\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275443 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-scripts\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.275464 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-combined-ca-bundle\") pod \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\" (UID: \"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf\") "
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.277730 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.278392 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-logs" (OuterVolumeSpecName: "logs") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.281207 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-scripts" (OuterVolumeSpecName: "scripts") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.283663 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.293006 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-kube-api-access-wfv5r" (OuterVolumeSpecName: "kube-api-access-wfv5r") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "kube-api-access-wfv5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.320228 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.352185 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.356029 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data" (OuterVolumeSpecName: "config-data") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.368212 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" (UID: "85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378589 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-logs\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378632 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378647 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378657 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378669 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378680 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfv5r\" (UniqueName: \"kubernetes.io/projected/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-kube-api-access-wfv5r\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378710 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-public-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378722 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.378732 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.542144 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.707477 4732 generic.go:334] "Generic (PLEG): container finished" podID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerID="dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68" exitCode=0
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.707524 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf","Type":"ContainerDied","Data":"dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68"}
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.707551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf","Type":"ContainerDied","Data":"0465b4761ff491facfcdece095c0e8f8b980fd9b4d6a7907c39a7c2373480d05"}
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.707570 4732 scope.go:117] "RemoveContainer" containerID="dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.707732 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.754511 4732 scope.go:117] "RemoveContainer" containerID="a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.755363 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.768136 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.776146 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:22:33 crc kubenswrapper[4732]: E1010 08:22:33.776685 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api-log"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.776730 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api-log"
Oct 10 08:22:33 crc kubenswrapper[4732]: E1010 08:22:33.776754 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.776769 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.777136 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api-log"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.777181 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" containerName="cinder-api"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.778825 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.781474 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.781860 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.782122 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.784736 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.789955 4732 scope.go:117] "RemoveContainer" containerID="dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68"
Oct 10 08:22:33 crc kubenswrapper[4732]: E1010 08:22:33.790409 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68\": container with ID starting with dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68 not found: ID does not exist" containerID="dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.790523 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68"} err="failed to get container status \"dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68\": rpc error: code = NotFound desc = could not find container \"dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68\": container with ID starting with dd2649c7b843f11c65882cbfce558849f24e3bbe1741bfe5f38e0504338f9d68 not found: ID does not exist"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.790619 4732 scope.go:117] "RemoveContainer" containerID="a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703"
Oct 10 08:22:33 crc kubenswrapper[4732]: E1010 08:22:33.791060 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703\": container with ID starting with a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703 not found: ID does not exist" containerID="a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.791159 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703"} err="failed to get container status \"a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703\": rpc error: code = NotFound desc = could not find container \"a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703\": container with ID starting with a9e6dd23d337fd36533f97d6f57740047e09c0ad6878089937d31c38858b0703 not found: ID does not exist"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.894806 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-scripts\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.894867 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/917ad3d8-97f9-4994-9f75-a6f9307137c9-logs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.894898 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.894985 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/917ad3d8-97f9-4994-9f75-a6f9307137c9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.895007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.895101 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv455\" (UniqueName: \"kubernetes.io/projected/917ad3d8-97f9-4994-9f75-a6f9307137c9-kube-api-access-hv455\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.895138 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-config-data-custom\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.895199 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-config-data\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.895241 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.997148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-scripts\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.997224 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/917ad3d8-97f9-4994-9f75-a6f9307137c9-logs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.997855 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/917ad3d8-97f9-4994-9f75-a6f9307137c9-logs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.998515 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.998673 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/917ad3d8-97f9-4994-9f75-a6f9307137c9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.998805 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.998765 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/917ad3d8-97f9-4994-9f75-a6f9307137c9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.999552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv455\" (UniqueName: \"kubernetes.io/projected/917ad3d8-97f9-4994-9f75-a6f9307137c9-kube-api-access-hv455\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.999613 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-config-data-custom\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.999810 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-config-data\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:33 crc kubenswrapper[4732]: I1010 08:22:33.999886 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.003650 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.004294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-config-data-custom\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.004451 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.005062 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.016138 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-scripts\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.017854 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/917ad3d8-97f9-4994-9f75-a6f9307137c9-config-data\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.032261 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv455\" (UniqueName: \"kubernetes.io/projected/917ad3d8-97f9-4994-9f75-a6f9307137c9-kube-api-access-hv455\") pod \"cinder-api-0\" (UID: \"917ad3d8-97f9-4994-9f75-a6f9307137c9\") " pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.152288 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.682659 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 10 08:22:34 crc kubenswrapper[4732]: W1010 08:22:34.689290 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod917ad3d8_97f9_4994_9f75_a6f9307137c9.slice/crio-214267c0da9d3dbab4c7a56fe94e95e194bde16634e83d3212fcb996113069b3 WatchSource:0}: Error finding container 214267c0da9d3dbab4c7a56fe94e95e194bde16634e83d3212fcb996113069b3: Status 404 returned error can't find the container with id 214267c0da9d3dbab4c7a56fe94e95e194bde16634e83d3212fcb996113069b3
Oct 10 08:22:34 crc kubenswrapper[4732]: I1010 08:22:34.717287 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"917ad3d8-97f9-4994-9f75-a6f9307137c9","Type":"ContainerStarted","Data":"214267c0da9d3dbab4c7a56fe94e95e194bde16634e83d3212fcb996113069b3"}
Oct 10 08:22:35 crc kubenswrapper[4732]: I1010 08:22:35.674627 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf" path="/var/lib/kubelet/pods/85c3a2a2-fe9a-40d4-9356-4a6c7d7e5baf/volumes"
Oct 10 08:22:35 crc kubenswrapper[4732]: I1010 08:22:35.728036 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"917ad3d8-97f9-4994-9f75-a6f9307137c9","Type":"ContainerStarted","Data":"6cf5fabd1a5f7e7c8f2e48f9a0cf5a67d08ddb22b81d29f6ecc9b7df64a016a2"}
Oct 10 08:22:36 crc kubenswrapper[4732]: I1010 08:22:36.740403 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"917ad3d8-97f9-4994-9f75-a6f9307137c9","Type":"ContainerStarted","Data":"7a290d5c7e2f84bfd534eecda97c2076b39e075976e00625088af5063311137a"}
Oct 10 08:22:36 crc kubenswrapper[4732]: I1010 08:22:36.740929 4732 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="" pod="openstack/cinder-api-0" Oct 10 08:22:36 crc kubenswrapper[4732]: I1010 08:22:36.778644 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.778622671 podStartE2EDuration="3.778622671s" podCreationTimestamp="2025-10-10 08:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:22:36.775488706 +0000 UTC m=+5483.845079987" watchObservedRunningTime="2025-10-10 08:22:36.778622671 +0000 UTC m=+5483.848213922" Oct 10 08:22:38 crc kubenswrapper[4732]: I1010 08:22:38.737130 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 08:22:38 crc kubenswrapper[4732]: I1010 08:22:38.816790 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:22:38 crc kubenswrapper[4732]: I1010 08:22:38.817037 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="cinder-scheduler" containerID="cri-o://848176b4f486223709246b715fc88edd7abced4dfa2b93f698fe1ca8d4bf91c4" gracePeriod=30 Oct 10 08:22:38 crc kubenswrapper[4732]: I1010 08:22:38.817197 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="probe" containerID="cri-o://1d20cb0823d5c1581f8c98ef188a326297ea8f79014cdeae11dc7dc8690134c0" gracePeriod=30 Oct 10 08:22:39 crc kubenswrapper[4732]: I1010 08:22:39.768238 4732 generic.go:334] "Generic (PLEG): container finished" podID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerID="1d20cb0823d5c1581f8c98ef188a326297ea8f79014cdeae11dc7dc8690134c0" exitCode=0 Oct 10 08:22:39 crc kubenswrapper[4732]: I1010 08:22:39.768293 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-scheduler-0" event={"ID":"204a7c7c-d10f-4402-afa2-59dc26b84240","Type":"ContainerDied","Data":"1d20cb0823d5c1581f8c98ef188a326297ea8f79014cdeae11dc7dc8690134c0"} Oct 10 08:22:40 crc kubenswrapper[4732]: E1010 08:22:40.701458 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204a7c7c_d10f_4402_afa2_59dc26b84240.slice/crio-848176b4f486223709246b715fc88edd7abced4dfa2b93f698fe1ca8d4bf91c4.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:22:40 crc kubenswrapper[4732]: I1010 08:22:40.792351 4732 generic.go:334] "Generic (PLEG): container finished" podID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerID="848176b4f486223709246b715fc88edd7abced4dfa2b93f698fe1ca8d4bf91c4" exitCode=0 Oct 10 08:22:40 crc kubenswrapper[4732]: I1010 08:22:40.792403 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"204a7c7c-d10f-4402-afa2-59dc26b84240","Type":"ContainerDied","Data":"848176b4f486223709246b715fc88edd7abced4dfa2b93f698fe1ca8d4bf91c4"} Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.082642 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.149022 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data\") pod \"204a7c7c-d10f-4402-afa2-59dc26b84240\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.149124 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-combined-ca-bundle\") pod \"204a7c7c-d10f-4402-afa2-59dc26b84240\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.149239 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n8jf\" (UniqueName: \"kubernetes.io/projected/204a7c7c-d10f-4402-afa2-59dc26b84240-kube-api-access-4n8jf\") pod \"204a7c7c-d10f-4402-afa2-59dc26b84240\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.149336 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data-custom\") pod \"204a7c7c-d10f-4402-afa2-59dc26b84240\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.149517 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-scripts\") pod \"204a7c7c-d10f-4402-afa2-59dc26b84240\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.149536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/204a7c7c-d10f-4402-afa2-59dc26b84240-etc-machine-id\") pod \"204a7c7c-d10f-4402-afa2-59dc26b84240\" (UID: \"204a7c7c-d10f-4402-afa2-59dc26b84240\") " Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.150120 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/204a7c7c-d10f-4402-afa2-59dc26b84240-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "204a7c7c-d10f-4402-afa2-59dc26b84240" (UID: "204a7c7c-d10f-4402-afa2-59dc26b84240"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.157943 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-scripts" (OuterVolumeSpecName: "scripts") pod "204a7c7c-d10f-4402-afa2-59dc26b84240" (UID: "204a7c7c-d10f-4402-afa2-59dc26b84240"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.158374 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204a7c7c-d10f-4402-afa2-59dc26b84240-kube-api-access-4n8jf" (OuterVolumeSpecName: "kube-api-access-4n8jf") pod "204a7c7c-d10f-4402-afa2-59dc26b84240" (UID: "204a7c7c-d10f-4402-afa2-59dc26b84240"). InnerVolumeSpecName "kube-api-access-4n8jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.162500 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "204a7c7c-d10f-4402-afa2-59dc26b84240" (UID: "204a7c7c-d10f-4402-afa2-59dc26b84240"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.214384 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "204a7c7c-d10f-4402-afa2-59dc26b84240" (UID: "204a7c7c-d10f-4402-afa2-59dc26b84240"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.252007 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.252041 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n8jf\" (UniqueName: \"kubernetes.io/projected/204a7c7c-d10f-4402-afa2-59dc26b84240-kube-api-access-4n8jf\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.252054 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.252063 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.252074 4732 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/204a7c7c-d10f-4402-afa2-59dc26b84240-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.278557 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data" (OuterVolumeSpecName: "config-data") pod "204a7c7c-d10f-4402-afa2-59dc26b84240" (UID: "204a7c7c-d10f-4402-afa2-59dc26b84240"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.354264 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/204a7c7c-d10f-4402-afa2-59dc26b84240-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.802511 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"204a7c7c-d10f-4402-afa2-59dc26b84240","Type":"ContainerDied","Data":"417df0b0ee62d38a991080337e93689dadaaab8975e81adc0d1a9370b45ed141"} Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.802786 4732 scope.go:117] "RemoveContainer" containerID="1d20cb0823d5c1581f8c98ef188a326297ea8f79014cdeae11dc7dc8690134c0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.802845 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.835047 4732 scope.go:117] "RemoveContainer" containerID="848176b4f486223709246b715fc88edd7abced4dfa2b93f698fe1ca8d4bf91c4" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.843073 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.851478 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.874337 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:22:41 crc kubenswrapper[4732]: E1010 08:22:41.874815 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="cinder-scheduler" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.874834 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="cinder-scheduler" Oct 10 08:22:41 crc kubenswrapper[4732]: E1010 08:22:41.874851 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="probe" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.874858 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="probe" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.875010 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="cinder-scheduler" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.875024 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" containerName="probe" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.875969 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.878658 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.889857 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.965314 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.965398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.965473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.965572 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrtqz\" (UniqueName: \"kubernetes.io/projected/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-kube-api-access-lrtqz\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 
08:22:41.965609 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:41 crc kubenswrapper[4732]: I1010 08:22:41.965772 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.068138 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.068389 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.068516 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.068916 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.069030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrtqz\" (UniqueName: \"kubernetes.io/projected/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-kube-api-access-lrtqz\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.069106 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.069327 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.075211 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.075385 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " 
pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.076643 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.084844 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.087229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrtqz\" (UniqueName: \"kubernetes.io/projected/e3993a84-cfe6-47e0-9f72-5d56aa71cdba-kube-api-access-lrtqz\") pod \"cinder-scheduler-0\" (UID: \"e3993a84-cfe6-47e0-9f72-5d56aa71cdba\") " pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.206356 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.689021 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 10 08:22:42 crc kubenswrapper[4732]: I1010 08:22:42.818012 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3993a84-cfe6-47e0-9f72-5d56aa71cdba","Type":"ContainerStarted","Data":"7db9d4b7780e37d46fb01868d65e16e9180f71e18ec1a6b8610d5cee1e2ac1ee"} Oct 10 08:22:43 crc kubenswrapper[4732]: I1010 08:22:43.681059 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204a7c7c-d10f-4402-afa2-59dc26b84240" path="/var/lib/kubelet/pods/204a7c7c-d10f-4402-afa2-59dc26b84240/volumes" Oct 10 08:22:43 crc kubenswrapper[4732]: I1010 08:22:43.833434 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3993a84-cfe6-47e0-9f72-5d56aa71cdba","Type":"ContainerStarted","Data":"7680b5fd657262f427652fff90b5ace949a81f9dd465b39ae682d8a9756ba2a0"} Oct 10 08:22:44 crc kubenswrapper[4732]: I1010 08:22:44.850413 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3993a84-cfe6-47e0-9f72-5d56aa71cdba","Type":"ContainerStarted","Data":"47b2276e8c3819bcfbdd86ce6b9b9a43056ac9341a006b5eb285b4360147ef2b"} Oct 10 08:22:45 crc kubenswrapper[4732]: I1010 08:22:45.853033 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 10 08:22:45 crc kubenswrapper[4732]: I1010 08:22:45.883137 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.883104459 podStartE2EDuration="4.883104459s" podCreationTimestamp="2025-10-10 08:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:22:44.881445237 +0000 
UTC m=+5491.951036488" watchObservedRunningTime="2025-10-10 08:22:45.883104459 +0000 UTC m=+5492.952695740" Oct 10 08:22:47 crc kubenswrapper[4732]: I1010 08:22:47.207454 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 10 08:22:52 crc kubenswrapper[4732]: I1010 08:22:52.451088 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 10 08:22:53 crc kubenswrapper[4732]: I1010 08:22:53.481848 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-j7p9g"] Oct 10 08:22:53 crc kubenswrapper[4732]: I1010 08:22:53.483720 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-j7p9g" Oct 10 08:22:53 crc kubenswrapper[4732]: I1010 08:22:53.495662 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-j7p9g"] Oct 10 08:22:53 crc kubenswrapper[4732]: I1010 08:22:53.605227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq6fc\" (UniqueName: \"kubernetes.io/projected/9208efd3-71f4-4b1d-9973-c27759728a30-kube-api-access-wq6fc\") pod \"glance-db-create-j7p9g\" (UID: \"9208efd3-71f4-4b1d-9973-c27759728a30\") " pod="openstack/glance-db-create-j7p9g" Oct 10 08:22:53 crc kubenswrapper[4732]: I1010 08:22:53.707253 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq6fc\" (UniqueName: \"kubernetes.io/projected/9208efd3-71f4-4b1d-9973-c27759728a30-kube-api-access-wq6fc\") pod \"glance-db-create-j7p9g\" (UID: \"9208efd3-71f4-4b1d-9973-c27759728a30\") " pod="openstack/glance-db-create-j7p9g" Oct 10 08:22:53 crc kubenswrapper[4732]: I1010 08:22:53.733464 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq6fc\" (UniqueName: 
\"kubernetes.io/projected/9208efd3-71f4-4b1d-9973-c27759728a30-kube-api-access-wq6fc\") pod \"glance-db-create-j7p9g\" (UID: \"9208efd3-71f4-4b1d-9973-c27759728a30\") " pod="openstack/glance-db-create-j7p9g" Oct 10 08:22:53 crc kubenswrapper[4732]: I1010 08:22:53.799192 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-j7p9g" Oct 10 08:22:54 crc kubenswrapper[4732]: I1010 08:22:54.311180 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-j7p9g"] Oct 10 08:22:54 crc kubenswrapper[4732]: I1010 08:22:54.961090 4732 generic.go:334] "Generic (PLEG): container finished" podID="9208efd3-71f4-4b1d-9973-c27759728a30" containerID="22194aef04bf1366955ade452fe2be591b4cdbf3cb21d9d2f2f05f37e156282e" exitCode=0 Oct 10 08:22:54 crc kubenswrapper[4732]: I1010 08:22:54.961175 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j7p9g" event={"ID":"9208efd3-71f4-4b1d-9973-c27759728a30","Type":"ContainerDied","Data":"22194aef04bf1366955ade452fe2be591b4cdbf3cb21d9d2f2f05f37e156282e"} Oct 10 08:22:54 crc kubenswrapper[4732]: I1010 08:22:54.961396 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j7p9g" event={"ID":"9208efd3-71f4-4b1d-9973-c27759728a30","Type":"ContainerStarted","Data":"58e639131874a02a676de2cdd9a6de4212d1308e8001dee9200e25dca925508e"} Oct 10 08:22:56 crc kubenswrapper[4732]: I1010 08:22:56.319019 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j7p9g" Oct 10 08:22:56 crc kubenswrapper[4732]: I1010 08:22:56.458942 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq6fc\" (UniqueName: \"kubernetes.io/projected/9208efd3-71f4-4b1d-9973-c27759728a30-kube-api-access-wq6fc\") pod \"9208efd3-71f4-4b1d-9973-c27759728a30\" (UID: \"9208efd3-71f4-4b1d-9973-c27759728a30\") " Oct 10 08:22:56 crc kubenswrapper[4732]: I1010 08:22:56.467678 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9208efd3-71f4-4b1d-9973-c27759728a30-kube-api-access-wq6fc" (OuterVolumeSpecName: "kube-api-access-wq6fc") pod "9208efd3-71f4-4b1d-9973-c27759728a30" (UID: "9208efd3-71f4-4b1d-9973-c27759728a30"). InnerVolumeSpecName "kube-api-access-wq6fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:22:56 crc kubenswrapper[4732]: I1010 08:22:56.561346 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq6fc\" (UniqueName: \"kubernetes.io/projected/9208efd3-71f4-4b1d-9973-c27759728a30-kube-api-access-wq6fc\") on node \"crc\" DevicePath \"\"" Oct 10 08:22:56 crc kubenswrapper[4732]: I1010 08:22:56.987346 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-j7p9g" event={"ID":"9208efd3-71f4-4b1d-9973-c27759728a30","Type":"ContainerDied","Data":"58e639131874a02a676de2cdd9a6de4212d1308e8001dee9200e25dca925508e"} Oct 10 08:22:56 crc kubenswrapper[4732]: I1010 08:22:56.987863 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58e639131874a02a676de2cdd9a6de4212d1308e8001dee9200e25dca925508e" Oct 10 08:22:56 crc kubenswrapper[4732]: I1010 08:22:56.987426 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-j7p9g" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.592155 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-784f-account-create-z8p4z"] Oct 10 08:23:03 crc kubenswrapper[4732]: E1010 08:23:03.593181 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9208efd3-71f4-4b1d-9973-c27759728a30" containerName="mariadb-database-create" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.593200 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9208efd3-71f4-4b1d-9973-c27759728a30" containerName="mariadb-database-create" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.593396 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9208efd3-71f4-4b1d-9973-c27759728a30" containerName="mariadb-database-create" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.594142 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-784f-account-create-z8p4z" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.597591 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.605021 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-784f-account-create-z8p4z"] Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.731898 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfq5\" (UniqueName: \"kubernetes.io/projected/85dab8cc-da56-49c1-af51-eb0b2a24f3a9-kube-api-access-9dfq5\") pod \"glance-784f-account-create-z8p4z\" (UID: \"85dab8cc-da56-49c1-af51-eb0b2a24f3a9\") " pod="openstack/glance-784f-account-create-z8p4z" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.833898 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfq5\" (UniqueName: 
\"kubernetes.io/projected/85dab8cc-da56-49c1-af51-eb0b2a24f3a9-kube-api-access-9dfq5\") pod \"glance-784f-account-create-z8p4z\" (UID: \"85dab8cc-da56-49c1-af51-eb0b2a24f3a9\") " pod="openstack/glance-784f-account-create-z8p4z" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.857358 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfq5\" (UniqueName: \"kubernetes.io/projected/85dab8cc-da56-49c1-af51-eb0b2a24f3a9-kube-api-access-9dfq5\") pod \"glance-784f-account-create-z8p4z\" (UID: \"85dab8cc-da56-49c1-af51-eb0b2a24f3a9\") " pod="openstack/glance-784f-account-create-z8p4z" Oct 10 08:23:03 crc kubenswrapper[4732]: I1010 08:23:03.915676 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-784f-account-create-z8p4z" Oct 10 08:23:04 crc kubenswrapper[4732]: I1010 08:23:04.470463 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-784f-account-create-z8p4z"] Oct 10 08:23:04 crc kubenswrapper[4732]: W1010 08:23:04.471414 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85dab8cc_da56_49c1_af51_eb0b2a24f3a9.slice/crio-c4d496cbca7cb6ab4945a0cd66b080fff7558ff66930809e6e0e77263f9d3153 WatchSource:0}: Error finding container c4d496cbca7cb6ab4945a0cd66b080fff7558ff66930809e6e0e77263f9d3153: Status 404 returned error can't find the container with id c4d496cbca7cb6ab4945a0cd66b080fff7558ff66930809e6e0e77263f9d3153 Oct 10 08:23:05 crc kubenswrapper[4732]: I1010 08:23:05.088452 4732 generic.go:334] "Generic (PLEG): container finished" podID="85dab8cc-da56-49c1-af51-eb0b2a24f3a9" containerID="4c4823b6931c0d2801aa4703bd727a3689261d41b68e4b197c5e33dbfbf0b3fd" exitCode=0 Oct 10 08:23:05 crc kubenswrapper[4732]: I1010 08:23:05.088562 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-784f-account-create-z8p4z" 
event={"ID":"85dab8cc-da56-49c1-af51-eb0b2a24f3a9","Type":"ContainerDied","Data":"4c4823b6931c0d2801aa4703bd727a3689261d41b68e4b197c5e33dbfbf0b3fd"} Oct 10 08:23:05 crc kubenswrapper[4732]: I1010 08:23:05.088864 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-784f-account-create-z8p4z" event={"ID":"85dab8cc-da56-49c1-af51-eb0b2a24f3a9","Type":"ContainerStarted","Data":"c4d496cbca7cb6ab4945a0cd66b080fff7558ff66930809e6e0e77263f9d3153"} Oct 10 08:23:06 crc kubenswrapper[4732]: I1010 08:23:06.468496 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-784f-account-create-z8p4z" Oct 10 08:23:06 crc kubenswrapper[4732]: I1010 08:23:06.592923 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfq5\" (UniqueName: \"kubernetes.io/projected/85dab8cc-da56-49c1-af51-eb0b2a24f3a9-kube-api-access-9dfq5\") pod \"85dab8cc-da56-49c1-af51-eb0b2a24f3a9\" (UID: \"85dab8cc-da56-49c1-af51-eb0b2a24f3a9\") " Oct 10 08:23:06 crc kubenswrapper[4732]: I1010 08:23:06.600003 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85dab8cc-da56-49c1-af51-eb0b2a24f3a9-kube-api-access-9dfq5" (OuterVolumeSpecName: "kube-api-access-9dfq5") pod "85dab8cc-da56-49c1-af51-eb0b2a24f3a9" (UID: "85dab8cc-da56-49c1-af51-eb0b2a24f3a9"). InnerVolumeSpecName "kube-api-access-9dfq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:23:06 crc kubenswrapper[4732]: I1010 08:23:06.695977 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfq5\" (UniqueName: \"kubernetes.io/projected/85dab8cc-da56-49c1-af51-eb0b2a24f3a9-kube-api-access-9dfq5\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:07 crc kubenswrapper[4732]: I1010 08:23:07.106968 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-784f-account-create-z8p4z" event={"ID":"85dab8cc-da56-49c1-af51-eb0b2a24f3a9","Type":"ContainerDied","Data":"c4d496cbca7cb6ab4945a0cd66b080fff7558ff66930809e6e0e77263f9d3153"} Oct 10 08:23:07 crc kubenswrapper[4732]: I1010 08:23:07.107016 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-784f-account-create-z8p4z" Oct 10 08:23:07 crc kubenswrapper[4732]: I1010 08:23:07.107026 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4d496cbca7cb6ab4945a0cd66b080fff7558ff66930809e6e0e77263f9d3153" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.644420 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zthxv"] Oct 10 08:23:08 crc kubenswrapper[4732]: E1010 08:23:08.645144 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85dab8cc-da56-49c1-af51-eb0b2a24f3a9" containerName="mariadb-account-create" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.645161 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="85dab8cc-da56-49c1-af51-eb0b2a24f3a9" containerName="mariadb-account-create" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.645362 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="85dab8cc-da56-49c1-af51-eb0b2a24f3a9" containerName="mariadb-account-create" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.646117 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.647826 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kbqcw" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.648227 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.663194 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zthxv"] Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.731925 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kqx7\" (UniqueName: \"kubernetes.io/projected/17871c51-5f9e-46b9-975e-8fd40a25c9df-kube-api-access-9kqx7\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.731989 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-config-data\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.732150 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-db-sync-config-data\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.732440 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-combined-ca-bundle\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.833946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kqx7\" (UniqueName: \"kubernetes.io/projected/17871c51-5f9e-46b9-975e-8fd40a25c9df-kube-api-access-9kqx7\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.834211 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-config-data\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.834379 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-db-sync-config-data\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.834565 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-combined-ca-bundle\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.843343 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-db-sync-config-data\") pod \"glance-db-sync-zthxv\" (UID: 
\"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.843425 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-config-data\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.843784 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-combined-ca-bundle\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.858418 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kqx7\" (UniqueName: \"kubernetes.io/projected/17871c51-5f9e-46b9-975e-8fd40a25c9df-kube-api-access-9kqx7\") pod \"glance-db-sync-zthxv\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:08 crc kubenswrapper[4732]: I1010 08:23:08.968646 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:09 crc kubenswrapper[4732]: I1010 08:23:09.634583 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zthxv"] Oct 10 08:23:10 crc kubenswrapper[4732]: I1010 08:23:10.141218 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zthxv" event={"ID":"17871c51-5f9e-46b9-975e-8fd40a25c9df","Type":"ContainerStarted","Data":"c87ace49b2a4715125fe4700d47eabd96a6c73c18d16dd33049cdec342d2c266"} Oct 10 08:23:26 crc kubenswrapper[4732]: I1010 08:23:26.293119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zthxv" event={"ID":"17871c51-5f9e-46b9-975e-8fd40a25c9df","Type":"ContainerStarted","Data":"0ab9c90065762eb79824bd5b2850939f2fa691e047b43cf96227aee6c59de592"} Oct 10 08:23:26 crc kubenswrapper[4732]: I1010 08:23:26.335348 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zthxv" podStartSLOduration=3.064663546 podStartE2EDuration="18.335321886s" podCreationTimestamp="2025-10-10 08:23:08 +0000 UTC" firstStartedPulling="2025-10-10 08:23:09.644677389 +0000 UTC m=+5516.714268620" lastFinishedPulling="2025-10-10 08:23:24.915335719 +0000 UTC m=+5531.984926960" observedRunningTime="2025-10-10 08:23:26.328759829 +0000 UTC m=+5533.398351150" watchObservedRunningTime="2025-10-10 08:23:26.335321886 +0000 UTC m=+5533.404913167" Oct 10 08:23:29 crc kubenswrapper[4732]: I1010 08:23:29.321383 4732 generic.go:334] "Generic (PLEG): container finished" podID="17871c51-5f9e-46b9-975e-8fd40a25c9df" containerID="0ab9c90065762eb79824bd5b2850939f2fa691e047b43cf96227aee6c59de592" exitCode=0 Oct 10 08:23:29 crc kubenswrapper[4732]: I1010 08:23:29.321486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zthxv" 
event={"ID":"17871c51-5f9e-46b9-975e-8fd40a25c9df","Type":"ContainerDied","Data":"0ab9c90065762eb79824bd5b2850939f2fa691e047b43cf96227aee6c59de592"} Oct 10 08:23:30 crc kubenswrapper[4732]: I1010 08:23:30.831680 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:30 crc kubenswrapper[4732]: I1010 08:23:30.965584 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kqx7\" (UniqueName: \"kubernetes.io/projected/17871c51-5f9e-46b9-975e-8fd40a25c9df-kube-api-access-9kqx7\") pod \"17871c51-5f9e-46b9-975e-8fd40a25c9df\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " Oct 10 08:23:30 crc kubenswrapper[4732]: I1010 08:23:30.965933 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-db-sync-config-data\") pod \"17871c51-5f9e-46b9-975e-8fd40a25c9df\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " Oct 10 08:23:30 crc kubenswrapper[4732]: I1010 08:23:30.966013 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-combined-ca-bundle\") pod \"17871c51-5f9e-46b9-975e-8fd40a25c9df\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " Oct 10 08:23:30 crc kubenswrapper[4732]: I1010 08:23:30.966051 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-config-data\") pod \"17871c51-5f9e-46b9-975e-8fd40a25c9df\" (UID: \"17871c51-5f9e-46b9-975e-8fd40a25c9df\") " Oct 10 08:23:30 crc kubenswrapper[4732]: I1010 08:23:30.972919 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17871c51-5f9e-46b9-975e-8fd40a25c9df-kube-api-access-9kqx7" 
(OuterVolumeSpecName: "kube-api-access-9kqx7") pod "17871c51-5f9e-46b9-975e-8fd40a25c9df" (UID: "17871c51-5f9e-46b9-975e-8fd40a25c9df"). InnerVolumeSpecName "kube-api-access-9kqx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:23:30 crc kubenswrapper[4732]: I1010 08:23:30.972963 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "17871c51-5f9e-46b9-975e-8fd40a25c9df" (UID: "17871c51-5f9e-46b9-975e-8fd40a25c9df"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.015942 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17871c51-5f9e-46b9-975e-8fd40a25c9df" (UID: "17871c51-5f9e-46b9-975e-8fd40a25c9df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.030733 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-config-data" (OuterVolumeSpecName: "config-data") pod "17871c51-5f9e-46b9-975e-8fd40a25c9df" (UID: "17871c51-5f9e-46b9-975e-8fd40a25c9df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.070019 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.070062 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.070078 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17871c51-5f9e-46b9-975e-8fd40a25c9df-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.070091 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kqx7\" (UniqueName: \"kubernetes.io/projected/17871c51-5f9e-46b9-975e-8fd40a25c9df-kube-api-access-9kqx7\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.342579 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zthxv" event={"ID":"17871c51-5f9e-46b9-975e-8fd40a25c9df","Type":"ContainerDied","Data":"c87ace49b2a4715125fe4700d47eabd96a6c73c18d16dd33049cdec342d2c266"} Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.342883 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c87ace49b2a4715125fe4700d47eabd96a6c73c18d16dd33049cdec342d2c266" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.342757 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zthxv" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.672978 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:31 crc kubenswrapper[4732]: E1010 08:23:31.673346 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17871c51-5f9e-46b9-975e-8fd40a25c9df" containerName="glance-db-sync" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.673368 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="17871c51-5f9e-46b9-975e-8fd40a25c9df" containerName="glance-db-sync" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.673606 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="17871c51-5f9e-46b9-975e-8fd40a25c9df" containerName="glance-db-sync" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.674759 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.676761 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kbqcw" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.677006 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.679786 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.693837 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.786779 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.786967 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-scripts\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.787018 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrqw\" (UniqueName: \"kubernetes.io/projected/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-kube-api-access-bcrqw\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.787055 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.787106 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.787180 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-logs\") pod 
\"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.789933 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bd46fcbfc-k2sk6"] Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.791315 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.800467 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd46fcbfc-k2sk6"] Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.874308 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.875803 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.878316 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.882189 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888215 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-dns-svc\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-config\") pod 
\"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888280 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-sb\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8fsc\" (UniqueName: \"kubernetes.io/projected/cc8b109a-f085-43d8-b43a-d4e196ce7dae-kube-api-access-l8fsc\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888339 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-scripts\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888368 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcrqw\" (UniqueName: \"kubernetes.io/projected/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-kube-api-access-bcrqw\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888390 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888406 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-nb\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888432 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-logs\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.888507 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-config-data\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.890133 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.890948 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-logs\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.895012 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-scripts\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.910059 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.910096 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrqw\" (UniqueName: \"kubernetes.io/projected/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-kube-api-access-bcrqw\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.914742 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-config-data\") pod \"glance-default-external-api-0\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:31 crc 
kubenswrapper[4732]: I1010 08:23:31.989596 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-dns-svc\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989655 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-config\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989675 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-sb\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989717 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8fsc\" (UniqueName: \"kubernetes.io/projected/cc8b109a-f085-43d8-b43a-d4e196ce7dae-kube-api-access-l8fsc\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989749 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989778 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989803 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-nb\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989847 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989871 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggjf\" (UniqueName: \"kubernetes.io/projected/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-kube-api-access-8ggjf\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.989901 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.990446 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-config\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.990796 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-dns-svc\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.990835 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-nb\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.991174 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-sb\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:31 crc kubenswrapper[4732]: I1010 08:23:31.992078 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.009928 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8fsc\" (UniqueName: \"kubernetes.io/projected/cc8b109a-f085-43d8-b43a-d4e196ce7dae-kube-api-access-l8fsc\") pod \"dnsmasq-dns-bd46fcbfc-k2sk6\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.093151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.093214 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggjf\" (UniqueName: \"kubernetes.io/projected/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-kube-api-access-8ggjf\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.093248 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.093341 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.093365 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.093396 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.093945 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.094181 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-logs\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.098474 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.099171 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.105424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.112711 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ggjf\" (UniqueName: \"kubernetes.io/projected/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-kube-api-access-8ggjf\") pod \"glance-default-internal-api-0\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.119065 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.265339 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.519979 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.644520 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd46fcbfc-k2sk6"] Oct 10 08:23:32 crc kubenswrapper[4732]: W1010 08:23:32.648433 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc8b109a_f085_43d8_b43a_d4e196ce7dae.slice/crio-6da00f4fd9b5f509c075c73da7f31dfbfe3a81b1d67a613962ae89dc52a6fcf4 WatchSource:0}: Error finding container 6da00f4fd9b5f509c075c73da7f31dfbfe3a81b1d67a613962ae89dc52a6fcf4: Status 404 returned error can't find the container with id 6da00f4fd9b5f509c075c73da7f31dfbfe3a81b1d67a613962ae89dc52a6fcf4 Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.796117 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:32 crc kubenswrapper[4732]: I1010 08:23:32.842034 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:33 crc kubenswrapper[4732]: I1010 08:23:33.377571 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c","Type":"ContainerStarted","Data":"845b79e15bf555194836db839c006d429426636b695c71f13992fd4dfd94d160"} Oct 10 08:23:33 crc kubenswrapper[4732]: I1010 08:23:33.393534 4732 generic.go:334] "Generic (PLEG): container finished" podID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerID="0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450" exitCode=0 Oct 10 08:23:33 crc kubenswrapper[4732]: I1010 08:23:33.393632 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" event={"ID":"cc8b109a-f085-43d8-b43a-d4e196ce7dae","Type":"ContainerDied","Data":"0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450"} Oct 10 08:23:33 crc kubenswrapper[4732]: I1010 08:23:33.393660 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" event={"ID":"cc8b109a-f085-43d8-b43a-d4e196ce7dae","Type":"ContainerStarted","Data":"6da00f4fd9b5f509c075c73da7f31dfbfe3a81b1d67a613962ae89dc52a6fcf4"} Oct 10 08:23:33 crc kubenswrapper[4732]: I1010 08:23:33.398449 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34c54546-ee3b-4e83-ba04-2d34a91c5ccd","Type":"ContainerStarted","Data":"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e"} Oct 10 08:23:33 crc kubenswrapper[4732]: I1010 08:23:33.398515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34c54546-ee3b-4e83-ba04-2d34a91c5ccd","Type":"ContainerStarted","Data":"50739846c5b91fc58c517faf822f9c726b640fe36f02c4c7681743c27aa618c4"} Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.158335 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.408207 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c","Type":"ContainerStarted","Data":"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6"} Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.408258 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c","Type":"ContainerStarted","Data":"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1"} Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.409803 
4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" event={"ID":"cc8b109a-f085-43d8-b43a-d4e196ce7dae","Type":"ContainerStarted","Data":"b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f"} Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.410476 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.414250 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34c54546-ee3b-4e83-ba04-2d34a91c5ccd","Type":"ContainerStarted","Data":"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f"} Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.414365 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-log" containerID="cri-o://6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e" gracePeriod=30 Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.414396 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-httpd" containerID="cri-o://b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f" gracePeriod=30 Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.435653 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.435633965 podStartE2EDuration="3.435633965s" podCreationTimestamp="2025-10-10 08:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:23:34.427907967 +0000 UTC m=+5541.497499228" watchObservedRunningTime="2025-10-10 
08:23:34.435633965 +0000 UTC m=+5541.505225206" Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.451430 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.451414571 podStartE2EDuration="3.451414571s" podCreationTimestamp="2025-10-10 08:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:23:34.448969085 +0000 UTC m=+5541.518560336" watchObservedRunningTime="2025-10-10 08:23:34.451414571 +0000 UTC m=+5541.521005812" Oct 10 08:23:34 crc kubenswrapper[4732]: I1010 08:23:34.471217 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" podStartSLOduration=3.47067351 podStartE2EDuration="3.47067351s" podCreationTimestamp="2025-10-10 08:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:23:34.465171402 +0000 UTC m=+5541.534762663" watchObservedRunningTime="2025-10-10 08:23:34.47067351 +0000 UTC m=+5541.540264751" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.063134 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.156238 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcrqw\" (UniqueName: \"kubernetes.io/projected/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-kube-api-access-bcrqw\") pod \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.156317 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-scripts\") pod \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.156365 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-combined-ca-bundle\") pod \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.156395 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-httpd-run\") pod \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.156413 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-config-data\") pod \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.156476 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-logs\") pod \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\" (UID: \"34c54546-ee3b-4e83-ba04-2d34a91c5ccd\") " Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.157364 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-logs" (OuterVolumeSpecName: "logs") pod "34c54546-ee3b-4e83-ba04-2d34a91c5ccd" (UID: "34c54546-ee3b-4e83-ba04-2d34a91c5ccd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.157853 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34c54546-ee3b-4e83-ba04-2d34a91c5ccd" (UID: "34c54546-ee3b-4e83-ba04-2d34a91c5ccd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.163003 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-kube-api-access-bcrqw" (OuterVolumeSpecName: "kube-api-access-bcrqw") pod "34c54546-ee3b-4e83-ba04-2d34a91c5ccd" (UID: "34c54546-ee3b-4e83-ba04-2d34a91c5ccd"). InnerVolumeSpecName "kube-api-access-bcrqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.176063 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-scripts" (OuterVolumeSpecName: "scripts") pod "34c54546-ee3b-4e83-ba04-2d34a91c5ccd" (UID: "34c54546-ee3b-4e83-ba04-2d34a91c5ccd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.205151 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34c54546-ee3b-4e83-ba04-2d34a91c5ccd" (UID: "34c54546-ee3b-4e83-ba04-2d34a91c5ccd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.228950 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-config-data" (OuterVolumeSpecName: "config-data") pod "34c54546-ee3b-4e83-ba04-2d34a91c5ccd" (UID: "34c54546-ee3b-4e83-ba04-2d34a91c5ccd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.258746 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcrqw\" (UniqueName: \"kubernetes.io/projected/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-kube-api-access-bcrqw\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.258776 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.258787 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.258796 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 
08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.258807 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.258818 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34c54546-ee3b-4e83-ba04-2d34a91c5ccd-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.425985 4732 generic.go:334] "Generic (PLEG): container finished" podID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerID="b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f" exitCode=0 Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.426017 4732 generic.go:334] "Generic (PLEG): container finished" podID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerID="6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e" exitCode=143 Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.426043 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34c54546-ee3b-4e83-ba04-2d34a91c5ccd","Type":"ContainerDied","Data":"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f"} Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.426144 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34c54546-ee3b-4e83-ba04-2d34a91c5ccd","Type":"ContainerDied","Data":"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e"} Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.426177 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34c54546-ee3b-4e83-ba04-2d34a91c5ccd","Type":"ContainerDied","Data":"50739846c5b91fc58c517faf822f9c726b640fe36f02c4c7681743c27aa618c4"} Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 
08:23:35.426203 4732 scope.go:117] "RemoveContainer" containerID="b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.426231 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-log" containerID="cri-o://9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1" gracePeriod=30 Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.426344 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-httpd" containerID="cri-o://53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6" gracePeriod=30 Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.427780 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.451391 4732 scope.go:117] "RemoveContainer" containerID="6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.469964 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.494469 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.499811 4732 scope.go:117] "RemoveContainer" containerID="b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f" Oct 10 08:23:35 crc kubenswrapper[4732]: E1010 08:23:35.501659 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f\": container 
with ID starting with b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f not found: ID does not exist" containerID="b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.501831 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f"} err="failed to get container status \"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f\": rpc error: code = NotFound desc = could not find container \"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f\": container with ID starting with b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f not found: ID does not exist" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.501944 4732 scope.go:117] "RemoveContainer" containerID="6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e" Oct 10 08:23:35 crc kubenswrapper[4732]: E1010 08:23:35.502434 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e\": container with ID starting with 6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e not found: ID does not exist" containerID="6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.502468 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e"} err="failed to get container status \"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e\": rpc error: code = NotFound desc = could not find container \"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e\": container with ID starting with 6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e not 
found: ID does not exist" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.502489 4732 scope.go:117] "RemoveContainer" containerID="b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.502761 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f"} err="failed to get container status \"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f\": rpc error: code = NotFound desc = could not find container \"b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f\": container with ID starting with b724f1a8b878159aad621ae0b903a7843794e62d9fbdfc06c6d2b2be5aed8b0f not found: ID does not exist" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.502781 4732 scope.go:117] "RemoveContainer" containerID="6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.503377 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e"} err="failed to get container status \"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e\": rpc error: code = NotFound desc = could not find container \"6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e\": container with ID starting with 6bdbbc00e0878b062a75bd7922806f8252426d771099b90bf18bb135654d0f6e not found: ID does not exist" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.512328 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:35 crc kubenswrapper[4732]: E1010 08:23:35.512771 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-httpd" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.512787 
4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-httpd" Oct 10 08:23:35 crc kubenswrapper[4732]: E1010 08:23:35.512808 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-log" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.512817 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-log" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.513037 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-log" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.513066 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" containerName="glance-httpd" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.514234 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.518356 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.518785 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.545764 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.562997 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-config-data\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.563050 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.563069 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-logs\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.563111 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-scripts\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.563134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmlc8\" (UniqueName: \"kubernetes.io/projected/66953b37-077e-49e6-905a-67c904325828-kube-api-access-hmlc8\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.563157 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.563227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.665192 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmlc8\" (UniqueName: \"kubernetes.io/projected/66953b37-077e-49e6-905a-67c904325828-kube-api-access-hmlc8\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.665240 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.665323 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.665374 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-config-data\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.665404 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.665430 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-logs\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.665478 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-scripts\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.666054 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.666105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-logs\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.669414 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-scripts\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.669464 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.669726 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.670265 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.672168 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c54546-ee3b-4e83-ba04-2d34a91c5ccd" path="/var/lib/kubelet/pods/34c54546-ee3b-4e83-ba04-2d34a91c5ccd/volumes" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.685099 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmlc8\" (UniqueName: \"kubernetes.io/projected/66953b37-077e-49e6-905a-67c904325828-kube-api-access-hmlc8\") pod \"glance-default-external-api-0\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " pod="openstack/glance-default-external-api-0" Oct 10 08:23:35 crc kubenswrapper[4732]: I1010 08:23:35.850918 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.059714 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.173225 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ggjf\" (UniqueName: \"kubernetes.io/projected/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-kube-api-access-8ggjf\") pod \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.173448 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-config-data\") pod \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.173502 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-logs\") pod \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.173613 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-combined-ca-bundle\") pod \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.173652 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-httpd-run\") pod \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.173721 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-scripts\") pod \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\" (UID: \"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c\") " Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.176071 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-logs" (OuterVolumeSpecName: "logs") pod "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" (UID: "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.176437 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" (UID: "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.181062 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-kube-api-access-8ggjf" (OuterVolumeSpecName: "kube-api-access-8ggjf") pod "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" (UID: "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c"). InnerVolumeSpecName "kube-api-access-8ggjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.185937 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-scripts" (OuterVolumeSpecName: "scripts") pod "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" (UID: "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.203434 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" (UID: "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.233569 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-config-data" (OuterVolumeSpecName: "config-data") pod "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" (UID: "5ec03f14-4bb8-463b-8f1d-cb3de7c3776c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.276229 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ggjf\" (UniqueName: \"kubernetes.io/projected/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-kube-api-access-8ggjf\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.276267 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.276277 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.276286 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 
08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.276294 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.276303 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.388987 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:23:36 crc kubenswrapper[4732]: W1010 08:23:36.402921 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66953b37_077e_49e6_905a_67c904325828.slice/crio-52feb45cf3468ee9c1146a019110d74496ce34550ab0567cda6c20bda1705964 WatchSource:0}: Error finding container 52feb45cf3468ee9c1146a019110d74496ce34550ab0567cda6c20bda1705964: Status 404 returned error can't find the container with id 52feb45cf3468ee9c1146a019110d74496ce34550ab0567cda6c20bda1705964 Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.449840 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66953b37-077e-49e6-905a-67c904325828","Type":"ContainerStarted","Data":"52feb45cf3468ee9c1146a019110d74496ce34550ab0567cda6c20bda1705964"} Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.454623 4732 generic.go:334] "Generic (PLEG): container finished" podID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerID="53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6" exitCode=0 Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.454671 4732 generic.go:334] "Generic (PLEG): container finished" podID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" 
containerID="9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1" exitCode=143 Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.455059 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.455736 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c","Type":"ContainerDied","Data":"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6"} Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.455776 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c","Type":"ContainerDied","Data":"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1"} Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.455791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5ec03f14-4bb8-463b-8f1d-cb3de7c3776c","Type":"ContainerDied","Data":"845b79e15bf555194836db839c006d429426636b695c71f13992fd4dfd94d160"} Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.455810 4732 scope.go:117] "RemoveContainer" containerID="53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.484666 4732 scope.go:117] "RemoveContainer" containerID="9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.514578 4732 scope.go:117] "RemoveContainer" containerID="53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.514681 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:36 crc kubenswrapper[4732]: E1010 08:23:36.517431 4732 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6\": container with ID starting with 53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6 not found: ID does not exist" containerID="53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.517481 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6"} err="failed to get container status \"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6\": rpc error: code = NotFound desc = could not find container \"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6\": container with ID starting with 53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6 not found: ID does not exist" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.517508 4732 scope.go:117] "RemoveContainer" containerID="9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1" Oct 10 08:23:36 crc kubenswrapper[4732]: E1010 08:23:36.520732 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1\": container with ID starting with 9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1 not found: ID does not exist" containerID="9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.520790 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1"} err="failed to get container status \"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1\": rpc error: code = NotFound 
desc = could not find container \"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1\": container with ID starting with 9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1 not found: ID does not exist" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.520825 4732 scope.go:117] "RemoveContainer" containerID="53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.521790 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6"} err="failed to get container status \"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6\": rpc error: code = NotFound desc = could not find container \"53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6\": container with ID starting with 53041e4bba5ee81db98ac047990798d8c4d686c0464d6081cba0c72d986f28b6 not found: ID does not exist" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.521842 4732 scope.go:117] "RemoveContainer" containerID="9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.522057 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.522466 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1"} err="failed to get container status \"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1\": rpc error: code = NotFound desc = could not find container \"9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1\": container with ID starting with 9d4b5567ec592e8282897ab6214c3f0a380e03d739f54b8eb5d6c4457472e9b1 not found: ID does not exist" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 
08:23:36.538841 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:36 crc kubenswrapper[4732]: E1010 08:23:36.539257 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-httpd" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.539270 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-httpd" Oct 10 08:23:36 crc kubenswrapper[4732]: E1010 08:23:36.539294 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-log" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.539300 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-log" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.539472 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-log" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.539494 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" containerName="glance-httpd" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.550468 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.550570 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.556669 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.556670 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.683556 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-logs\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.683643 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.683704 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.683773 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.683863 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.684018 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.684133 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4pm\" (UniqueName: \"kubernetes.io/projected/4c4190a9-ad8a-4274-8860-a527e05aa3f5-kube-api-access-dk4pm\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.786213 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4pm\" (UniqueName: \"kubernetes.io/projected/4c4190a9-ad8a-4274-8860-a527e05aa3f5-kube-api-access-dk4pm\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.786608 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.786717 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.786824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.786940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.787045 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.787284 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " 
pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.787968 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.788985 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-logs\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.802772 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.803546 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.805151 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.807112 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.833532 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4pm\" (UniqueName: \"kubernetes.io/projected/4c4190a9-ad8a-4274-8860-a527e05aa3f5-kube-api-access-dk4pm\") pod \"glance-default-internal-api-0\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:23:36 crc kubenswrapper[4732]: I1010 08:23:36.874213 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:37 crc kubenswrapper[4732]: I1010 08:23:37.393964 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:23:37 crc kubenswrapper[4732]: I1010 08:23:37.469391 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c4190a9-ad8a-4274-8860-a527e05aa3f5","Type":"ContainerStarted","Data":"a8835282a8857a058b41def75d56bb70bac3dc85ba7697f3d6f3a9c18852c48c"} Oct 10 08:23:37 crc kubenswrapper[4732]: I1010 08:23:37.472041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66953b37-077e-49e6-905a-67c904325828","Type":"ContainerStarted","Data":"cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f"} Oct 10 08:23:37 crc kubenswrapper[4732]: I1010 08:23:37.673174 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec03f14-4bb8-463b-8f1d-cb3de7c3776c" path="/var/lib/kubelet/pods/5ec03f14-4bb8-463b-8f1d-cb3de7c3776c/volumes" Oct 10 08:23:38 crc kubenswrapper[4732]: I1010 
08:23:38.487921 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66953b37-077e-49e6-905a-67c904325828","Type":"ContainerStarted","Data":"7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d"} Oct 10 08:23:38 crc kubenswrapper[4732]: I1010 08:23:38.490393 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c4190a9-ad8a-4274-8860-a527e05aa3f5","Type":"ContainerStarted","Data":"a712184ddbc53f6abd6e6a0fb6febfb97002bf450ffb8ee97fe9774aae0ec1ed"} Oct 10 08:23:38 crc kubenswrapper[4732]: I1010 08:23:38.505593 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.505577898 podStartE2EDuration="3.505577898s" podCreationTimestamp="2025-10-10 08:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:23:38.504242052 +0000 UTC m=+5545.573833293" watchObservedRunningTime="2025-10-10 08:23:38.505577898 +0000 UTC m=+5545.575169139" Oct 10 08:23:38 crc kubenswrapper[4732]: I1010 08:23:38.528148 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.528128036 podStartE2EDuration="2.528128036s" podCreationTimestamp="2025-10-10 08:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:23:38.521852017 +0000 UTC m=+5545.591443268" watchObservedRunningTime="2025-10-10 08:23:38.528128036 +0000 UTC m=+5545.597719277" Oct 10 08:23:39 crc kubenswrapper[4732]: I1010 08:23:39.504564 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4c4190a9-ad8a-4274-8860-a527e05aa3f5","Type":"ContainerStarted","Data":"9f828a17179e7faacae59ede9226a1ecb9fd4da98b97042352f4833a133dd673"} Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.120910 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.203151 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"] Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.204138 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" podUID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerName="dnsmasq-dns" containerID="cri-o://8102c3a5a8110272ae763a3f38ed415884058b0c397765ee61db62162e2ec321" gracePeriod=10 Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.543840 4732 generic.go:334] "Generic (PLEG): container finished" podID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerID="8102c3a5a8110272ae763a3f38ed415884058b0c397765ee61db62162e2ec321" exitCode=0 Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.544385 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" event={"ID":"64e501ef-9f03-4ca9-b9f7-425bb81a3435","Type":"ContainerDied","Data":"8102c3a5a8110272ae763a3f38ed415884058b0c397765ee61db62162e2ec321"} Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.670661 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.794621 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-nb\") pod \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.794815 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xppxz\" (UniqueName: \"kubernetes.io/projected/64e501ef-9f03-4ca9-b9f7-425bb81a3435-kube-api-access-xppxz\") pod \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.794857 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-sb\") pod \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.794945 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-dns-svc\") pod \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.794962 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-config\") pod \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\" (UID: \"64e501ef-9f03-4ca9-b9f7-425bb81a3435\") " Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.800351 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/64e501ef-9f03-4ca9-b9f7-425bb81a3435-kube-api-access-xppxz" (OuterVolumeSpecName: "kube-api-access-xppxz") pod "64e501ef-9f03-4ca9-b9f7-425bb81a3435" (UID: "64e501ef-9f03-4ca9-b9f7-425bb81a3435"). InnerVolumeSpecName "kube-api-access-xppxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.848779 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64e501ef-9f03-4ca9-b9f7-425bb81a3435" (UID: "64e501ef-9f03-4ca9-b9f7-425bb81a3435"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.854210 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-config" (OuterVolumeSpecName: "config") pod "64e501ef-9f03-4ca9-b9f7-425bb81a3435" (UID: "64e501ef-9f03-4ca9-b9f7-425bb81a3435"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.863253 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64e501ef-9f03-4ca9-b9f7-425bb81a3435" (UID: "64e501ef-9f03-4ca9-b9f7-425bb81a3435"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.870104 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64e501ef-9f03-4ca9-b9f7-425bb81a3435" (UID: "64e501ef-9f03-4ca9-b9f7-425bb81a3435"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.898294 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.898334 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xppxz\" (UniqueName: \"kubernetes.io/projected/64e501ef-9f03-4ca9-b9f7-425bb81a3435-kube-api-access-xppxz\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.898351 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.898364 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:42 crc kubenswrapper[4732]: I1010 08:23:42.898376 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e501ef-9f03-4ca9-b9f7-425bb81a3435-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:23:43 crc kubenswrapper[4732]: I1010 08:23:43.556754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" event={"ID":"64e501ef-9f03-4ca9-b9f7-425bb81a3435","Type":"ContainerDied","Data":"fffa92627b15b02f90a500e6ddc93521e7b739819176feffe282a5a5042811d9"} Oct 10 08:23:43 crc kubenswrapper[4732]: I1010 08:23:43.557065 4732 scope.go:117] "RemoveContainer" containerID="8102c3a5a8110272ae763a3f38ed415884058b0c397765ee61db62162e2ec321" Oct 10 08:23:43 crc kubenswrapper[4732]: I1010 08:23:43.556857 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fbf48bcd7-b7vc5" Oct 10 08:23:43 crc kubenswrapper[4732]: I1010 08:23:43.586562 4732 scope.go:117] "RemoveContainer" containerID="4ee06c0a96b42def01f7af723f735815bd9f8b3e9715b8944193e295421f02b8" Oct 10 08:23:43 crc kubenswrapper[4732]: I1010 08:23:43.608495 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"] Oct 10 08:23:43 crc kubenswrapper[4732]: I1010 08:23:43.615859 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fbf48bcd7-b7vc5"] Oct 10 08:23:43 crc kubenswrapper[4732]: I1010 08:23:43.672173 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" path="/var/lib/kubelet/pods/64e501ef-9f03-4ca9-b9f7-425bb81a3435/volumes" Oct 10 08:23:45 crc kubenswrapper[4732]: I1010 08:23:45.852055 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 08:23:45 crc kubenswrapper[4732]: I1010 08:23:45.852162 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 08:23:45 crc kubenswrapper[4732]: I1010 08:23:45.885234 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 08:23:45 crc kubenswrapper[4732]: I1010 08:23:45.896915 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 08:23:46 crc kubenswrapper[4732]: I1010 08:23:46.583213 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 08:23:46 crc kubenswrapper[4732]: I1010 08:23:46.583250 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 08:23:46 crc kubenswrapper[4732]: I1010 08:23:46.875004 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:46 crc kubenswrapper[4732]: I1010 08:23:46.875650 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:46 crc kubenswrapper[4732]: I1010 08:23:46.907160 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:46 crc kubenswrapper[4732]: I1010 08:23:46.919340 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.589170 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.589502 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.682664 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-869rb"] Oct 10 08:23:47 crc kubenswrapper[4732]: E1010 08:23:47.682944 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerName="dnsmasq-dns" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.682962 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerName="dnsmasq-dns" Oct 10 08:23:47 crc kubenswrapper[4732]: E1010 08:23:47.682969 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerName="init" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.682976 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerName="init" Oct 10 08:23:47 crc kubenswrapper[4732]: 
I1010 08:23:47.683160 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e501ef-9f03-4ca9-b9f7-425bb81a3435" containerName="dnsmasq-dns" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.684350 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.699337 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-869rb"] Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.811485 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzp4\" (UniqueName: \"kubernetes.io/projected/01194045-e744-4a03-8949-7ae6153309f0-kube-api-access-lwzp4\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.811564 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-catalog-content\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.811766 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-utilities\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.917635 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-utilities\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.917914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzp4\" (UniqueName: \"kubernetes.io/projected/01194045-e744-4a03-8949-7ae6153309f0-kube-api-access-lwzp4\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.917990 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-catalog-content\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.918782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-catalog-content\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.919092 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-utilities\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:47 crc kubenswrapper[4732]: I1010 08:23:47.940357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzp4\" (UniqueName: 
\"kubernetes.io/projected/01194045-e744-4a03-8949-7ae6153309f0-kube-api-access-lwzp4\") pod \"community-operators-869rb\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") " pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:48 crc kubenswrapper[4732]: I1010 08:23:48.007426 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:48 crc kubenswrapper[4732]: I1010 08:23:48.588084 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-869rb"] Oct 10 08:23:48 crc kubenswrapper[4732]: I1010 08:23:48.602031 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-869rb" event={"ID":"01194045-e744-4a03-8949-7ae6153309f0","Type":"ContainerStarted","Data":"3b8e54c8856827d64f2c26f8c5f220585d616abd50c791baf722bd526704fe2d"} Oct 10 08:23:48 crc kubenswrapper[4732]: I1010 08:23:48.697479 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 08:23:48 crc kubenswrapper[4732]: I1010 08:23:48.697837 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 08:23:48 crc kubenswrapper[4732]: I1010 08:23:48.702188 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 08:23:49 crc kubenswrapper[4732]: I1010 08:23:49.626667 4732 generic.go:334] "Generic (PLEG): container finished" podID="01194045-e744-4a03-8949-7ae6153309f0" containerID="0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b" exitCode=0 Oct 10 08:23:49 crc kubenswrapper[4732]: I1010 08:23:49.629656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-869rb" 
event={"ID":"01194045-e744-4a03-8949-7ae6153309f0","Type":"ContainerDied","Data":"0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b"} Oct 10 08:23:50 crc kubenswrapper[4732]: I1010 08:23:50.154721 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:50 crc kubenswrapper[4732]: I1010 08:23:50.155089 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 08:23:50 crc kubenswrapper[4732]: I1010 08:23:50.639923 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-869rb" event={"ID":"01194045-e744-4a03-8949-7ae6153309f0","Type":"ContainerStarted","Data":"4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f"} Oct 10 08:23:50 crc kubenswrapper[4732]: I1010 08:23:50.994937 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 08:23:51 crc kubenswrapper[4732]: I1010 08:23:51.652551 4732 generic.go:334] "Generic (PLEG): container finished" podID="01194045-e744-4a03-8949-7ae6153309f0" containerID="4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f" exitCode=0 Oct 10 08:23:51 crc kubenswrapper[4732]: I1010 08:23:51.652650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-869rb" event={"ID":"01194045-e744-4a03-8949-7ae6153309f0","Type":"ContainerDied","Data":"4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f"} Oct 10 08:23:52 crc kubenswrapper[4732]: I1010 08:23:52.664302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-869rb" event={"ID":"01194045-e744-4a03-8949-7ae6153309f0","Type":"ContainerStarted","Data":"d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95"} Oct 10 08:23:52 crc kubenswrapper[4732]: I1010 08:23:52.693382 4732 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-869rb" podStartSLOduration=3.085349771 podStartE2EDuration="5.693361617s" podCreationTimestamp="2025-10-10 08:23:47 +0000 UTC" firstStartedPulling="2025-10-10 08:23:49.63490545 +0000 UTC m=+5556.704496691" lastFinishedPulling="2025-10-10 08:23:52.242917286 +0000 UTC m=+5559.312508537" observedRunningTime="2025-10-10 08:23:52.68346076 +0000 UTC m=+5559.753052021" watchObservedRunningTime="2025-10-10 08:23:52.693361617 +0000 UTC m=+5559.762952868" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.075434 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbk54"] Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.079627 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.088196 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbk54"] Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.161500 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rkj\" (UniqueName: \"kubernetes.io/projected/7119b4da-dada-424e-af9f-eb1b4846eacd-kube-api-access-z7rkj\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.161917 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-utilities\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.162232 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-catalog-content\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.264431 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rkj\" (UniqueName: \"kubernetes.io/projected/7119b4da-dada-424e-af9f-eb1b4846eacd-kube-api-access-z7rkj\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.264494 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-utilities\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.264592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-catalog-content\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.265134 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-catalog-content\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.265268 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-utilities\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.283628 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rkj\" (UniqueName: \"kubernetes.io/projected/7119b4da-dada-424e-af9f-eb1b4846eacd-kube-api-access-z7rkj\") pod \"certified-operators-bbk54\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") " pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.356332 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.356396 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.408244 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbk54" Oct 10 08:23:55 crc kubenswrapper[4732]: I1010 08:23:55.930049 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbk54"] Oct 10 08:23:55 crc kubenswrapper[4732]: W1010 08:23:55.935332 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7119b4da_dada_424e_af9f_eb1b4846eacd.slice/crio-bfc8d108d18ad8dbb6b5a7cb2a918f95dcf76fd421f1511b126b4f29dc67c6a2 WatchSource:0}: Error finding container bfc8d108d18ad8dbb6b5a7cb2a918f95dcf76fd421f1511b126b4f29dc67c6a2: Status 404 returned error can't find the container with id bfc8d108d18ad8dbb6b5a7cb2a918f95dcf76fd421f1511b126b4f29dc67c6a2 Oct 10 08:23:56 crc kubenswrapper[4732]: I1010 08:23:56.725423 4732 generic.go:334] "Generic (PLEG): container finished" podID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerID="9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b" exitCode=0 Oct 10 08:23:56 crc kubenswrapper[4732]: I1010 08:23:56.725561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbk54" event={"ID":"7119b4da-dada-424e-af9f-eb1b4846eacd","Type":"ContainerDied","Data":"9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b"} Oct 10 08:23:56 crc kubenswrapper[4732]: I1010 08:23:56.725760 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbk54" event={"ID":"7119b4da-dada-424e-af9f-eb1b4846eacd","Type":"ContainerStarted","Data":"bfc8d108d18ad8dbb6b5a7cb2a918f95dcf76fd421f1511b126b4f29dc67c6a2"} Oct 10 08:23:57 crc kubenswrapper[4732]: I1010 08:23:57.738416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbk54" 
event={"ID":"7119b4da-dada-424e-af9f-eb1b4846eacd","Type":"ContainerStarted","Data":"50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80"} Oct 10 08:23:58 crc kubenswrapper[4732]: I1010 08:23:58.008595 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:58 crc kubenswrapper[4732]: I1010 08:23:58.008651 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-869rb" Oct 10 08:23:58 crc kubenswrapper[4732]: I1010 08:23:58.750533 4732 generic.go:334] "Generic (PLEG): container finished" podID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerID="50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80" exitCode=0 Oct 10 08:23:58 crc kubenswrapper[4732]: I1010 08:23:58.750614 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbk54" event={"ID":"7119b4da-dada-424e-af9f-eb1b4846eacd","Type":"ContainerDied","Data":"50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80"} Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.073637 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-869rb" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="registry-server" probeResult="failure" output=< Oct 10 08:23:59 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 08:23:59 crc kubenswrapper[4732]: > Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.520144 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-bnt72"] Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.521800 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-bnt72" Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.530947 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bnt72"] Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.694658 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2299n\" (UniqueName: \"kubernetes.io/projected/2ce602c1-4d4d-40ce-8712-c0a621e288b0-kube-api-access-2299n\") pod \"placement-db-create-bnt72\" (UID: \"2ce602c1-4d4d-40ce-8712-c0a621e288b0\") " pod="openstack/placement-db-create-bnt72" Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.760719 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbk54" event={"ID":"7119b4da-dada-424e-af9f-eb1b4846eacd","Type":"ContainerStarted","Data":"6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f"} Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.793009 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbk54" podStartSLOduration=2.322514815 podStartE2EDuration="4.792972571s" podCreationTimestamp="2025-10-10 08:23:55 +0000 UTC" firstStartedPulling="2025-10-10 08:23:56.735193101 +0000 UTC m=+5563.804784332" lastFinishedPulling="2025-10-10 08:23:59.205650837 +0000 UTC m=+5566.275242088" observedRunningTime="2025-10-10 08:23:59.783543087 +0000 UTC m=+5566.853134348" watchObservedRunningTime="2025-10-10 08:23:59.792972571 +0000 UTC m=+5566.862563832" Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.798910 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2299n\" (UniqueName: \"kubernetes.io/projected/2ce602c1-4d4d-40ce-8712-c0a621e288b0-kube-api-access-2299n\") pod \"placement-db-create-bnt72\" (UID: \"2ce602c1-4d4d-40ce-8712-c0a621e288b0\") " pod="openstack/placement-db-create-bnt72" 
Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.832632 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2299n\" (UniqueName: \"kubernetes.io/projected/2ce602c1-4d4d-40ce-8712-c0a621e288b0-kube-api-access-2299n\") pod \"placement-db-create-bnt72\" (UID: \"2ce602c1-4d4d-40ce-8712-c0a621e288b0\") " pod="openstack/placement-db-create-bnt72"
Oct 10 08:23:59 crc kubenswrapper[4732]: I1010 08:23:59.852417 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bnt72"
Oct 10 08:24:00 crc kubenswrapper[4732]: I1010 08:24:00.331569 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-bnt72"]
Oct 10 08:24:00 crc kubenswrapper[4732]: I1010 08:24:00.771107 4732 generic.go:334] "Generic (PLEG): container finished" podID="2ce602c1-4d4d-40ce-8712-c0a621e288b0" containerID="5a16c08c0398ebf8deb1bd17987e4d6b9fe7191d41ef8684d7ecb87beed51d98" exitCode=0
Oct 10 08:24:00 crc kubenswrapper[4732]: I1010 08:24:00.771169 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bnt72" event={"ID":"2ce602c1-4d4d-40ce-8712-c0a621e288b0","Type":"ContainerDied","Data":"5a16c08c0398ebf8deb1bd17987e4d6b9fe7191d41ef8684d7ecb87beed51d98"}
Oct 10 08:24:00 crc kubenswrapper[4732]: I1010 08:24:00.771438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bnt72" event={"ID":"2ce602c1-4d4d-40ce-8712-c0a621e288b0","Type":"ContainerStarted","Data":"6808eb4a0bf225adb97f2d5f5fc20d1a7caa0665d9b9c76652daa0d14b4133b7"}
Oct 10 08:24:02 crc kubenswrapper[4732]: I1010 08:24:02.216065 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bnt72"
Oct 10 08:24:02 crc kubenswrapper[4732]: I1010 08:24:02.346032 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2299n\" (UniqueName: \"kubernetes.io/projected/2ce602c1-4d4d-40ce-8712-c0a621e288b0-kube-api-access-2299n\") pod \"2ce602c1-4d4d-40ce-8712-c0a621e288b0\" (UID: \"2ce602c1-4d4d-40ce-8712-c0a621e288b0\") "
Oct 10 08:24:02 crc kubenswrapper[4732]: I1010 08:24:02.352043 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce602c1-4d4d-40ce-8712-c0a621e288b0-kube-api-access-2299n" (OuterVolumeSpecName: "kube-api-access-2299n") pod "2ce602c1-4d4d-40ce-8712-c0a621e288b0" (UID: "2ce602c1-4d4d-40ce-8712-c0a621e288b0"). InnerVolumeSpecName "kube-api-access-2299n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:24:02 crc kubenswrapper[4732]: I1010 08:24:02.448742 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2299n\" (UniqueName: \"kubernetes.io/projected/2ce602c1-4d4d-40ce-8712-c0a621e288b0-kube-api-access-2299n\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:02 crc kubenswrapper[4732]: I1010 08:24:02.808203 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-bnt72" event={"ID":"2ce602c1-4d4d-40ce-8712-c0a621e288b0","Type":"ContainerDied","Data":"6808eb4a0bf225adb97f2d5f5fc20d1a7caa0665d9b9c76652daa0d14b4133b7"}
Oct 10 08:24:02 crc kubenswrapper[4732]: I1010 08:24:02.808454 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6808eb4a0bf225adb97f2d5f5fc20d1a7caa0665d9b9c76652daa0d14b4133b7"
Oct 10 08:24:02 crc kubenswrapper[4732]: I1010 08:24:02.808250 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-bnt72"
Oct 10 08:24:05 crc kubenswrapper[4732]: I1010 08:24:05.408405 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bbk54"
Oct 10 08:24:05 crc kubenswrapper[4732]: I1010 08:24:05.409621 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbk54"
Oct 10 08:24:05 crc kubenswrapper[4732]: I1010 08:24:05.477598 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbk54"
Oct 10 08:24:05 crc kubenswrapper[4732]: I1010 08:24:05.893056 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bbk54"
Oct 10 08:24:05 crc kubenswrapper[4732]: I1010 08:24:05.945844 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbk54"]
Oct 10 08:24:07 crc kubenswrapper[4732]: I1010 08:24:07.860164 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bbk54" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="registry-server" containerID="cri-o://6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f" gracePeriod=2
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.087108 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-869rb"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.159954 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-869rb"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.367437 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbk54"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.460096 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-utilities\") pod \"7119b4da-dada-424e-af9f-eb1b4846eacd\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") "
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.460208 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-catalog-content\") pod \"7119b4da-dada-424e-af9f-eb1b4846eacd\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") "
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.460384 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7rkj\" (UniqueName: \"kubernetes.io/projected/7119b4da-dada-424e-af9f-eb1b4846eacd-kube-api-access-z7rkj\") pod \"7119b4da-dada-424e-af9f-eb1b4846eacd\" (UID: \"7119b4da-dada-424e-af9f-eb1b4846eacd\") "
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.461071 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-utilities" (OuterVolumeSpecName: "utilities") pod "7119b4da-dada-424e-af9f-eb1b4846eacd" (UID: "7119b4da-dada-424e-af9f-eb1b4846eacd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.467528 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7119b4da-dada-424e-af9f-eb1b4846eacd-kube-api-access-z7rkj" (OuterVolumeSpecName: "kube-api-access-z7rkj") pod "7119b4da-dada-424e-af9f-eb1b4846eacd" (UID: "7119b4da-dada-424e-af9f-eb1b4846eacd"). InnerVolumeSpecName "kube-api-access-z7rkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.503107 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7119b4da-dada-424e-af9f-eb1b4846eacd" (UID: "7119b4da-dada-424e-af9f-eb1b4846eacd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.562078 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7rkj\" (UniqueName: \"kubernetes.io/projected/7119b4da-dada-424e-af9f-eb1b4846eacd-kube-api-access-z7rkj\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.562112 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.562121 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7119b4da-dada-424e-af9f-eb1b4846eacd-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.874070 4732 generic.go:334] "Generic (PLEG): container finished" podID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerID="6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f" exitCode=0
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.874113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbk54" event={"ID":"7119b4da-dada-424e-af9f-eb1b4846eacd","Type":"ContainerDied","Data":"6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f"}
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.874157 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbk54" event={"ID":"7119b4da-dada-424e-af9f-eb1b4846eacd","Type":"ContainerDied","Data":"bfc8d108d18ad8dbb6b5a7cb2a918f95dcf76fd421f1511b126b4f29dc67c6a2"}
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.874162 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbk54"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.874174 4732 scope.go:117] "RemoveContainer" containerID="6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.897392 4732 scope.go:117] "RemoveContainer" containerID="50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.912481 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbk54"]
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.919547 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bbk54"]
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.935665 4732 scope.go:117] "RemoveContainer" containerID="9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.976984 4732 scope.go:117] "RemoveContainer" containerID="6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f"
Oct 10 08:24:08 crc kubenswrapper[4732]: E1010 08:24:08.977478 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f\": container with ID starting with 6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f not found: ID does not exist" containerID="6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.977687 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f"} err="failed to get container status \"6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f\": rpc error: code = NotFound desc = could not find container \"6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f\": container with ID starting with 6e74c3c3d6a73e59f1a0282bf38326c74f6f2289c9e42b92129c89e1e9af253f not found: ID does not exist"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.977880 4732 scope.go:117] "RemoveContainer" containerID="50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80"
Oct 10 08:24:08 crc kubenswrapper[4732]: E1010 08:24:08.978531 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80\": container with ID starting with 50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80 not found: ID does not exist" containerID="50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.978558 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80"} err="failed to get container status \"50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80\": rpc error: code = NotFound desc = could not find container \"50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80\": container with ID starting with 50d19b8f63061ad61c7e77e2c5196b3d4bb192baa603b459ea39cb9acc6c6d80 not found: ID does not exist"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.978580 4732 scope.go:117] "RemoveContainer" containerID="9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b"
Oct 10 08:24:08 crc kubenswrapper[4732]: E1010 08:24:08.979175 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b\": container with ID starting with 9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b not found: ID does not exist" containerID="9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b"
Oct 10 08:24:08 crc kubenswrapper[4732]: I1010 08:24:08.979363 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b"} err="failed to get container status \"9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b\": rpc error: code = NotFound desc = could not find container \"9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b\": container with ID starting with 9552f9d1718dec2b4f54180f5b3f0d1d7c56acb64a2162bb5306e48df670778b not found: ID does not exist"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.678295 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" path="/var/lib/kubelet/pods/7119b4da-dada-424e-af9f-eb1b4846eacd/volumes"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.679031 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0b3a-account-create-6ch6l"]
Oct 10 08:24:09 crc kubenswrapper[4732]: E1010 08:24:09.679372 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce602c1-4d4d-40ce-8712-c0a621e288b0" containerName="mariadb-database-create"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.679388 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce602c1-4d4d-40ce-8712-c0a621e288b0" containerName="mariadb-database-create"
Oct 10 08:24:09 crc kubenswrapper[4732]: E1010 08:24:09.679415 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="registry-server"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.679422 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="registry-server"
Oct 10 08:24:09 crc kubenswrapper[4732]: E1010 08:24:09.679445 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="extract-content"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.679452 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="extract-content"
Oct 10 08:24:09 crc kubenswrapper[4732]: E1010 08:24:09.679460 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="extract-utilities"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.679467 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="extract-utilities"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.679676 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7119b4da-dada-424e-af9f-eb1b4846eacd" containerName="registry-server"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.679694 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce602c1-4d4d-40ce-8712-c0a621e288b0" containerName="mariadb-database-create"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.686579 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0b3a-account-create-6ch6l"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.689418 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.691011 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0b3a-account-create-6ch6l"]
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.734944 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-869rb"]
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.785126 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl64m\" (UniqueName: \"kubernetes.io/projected/5049d408-b2a1-43a6-a2de-9516b7d0c78e-kube-api-access-sl64m\") pod \"placement-0b3a-account-create-6ch6l\" (UID: \"5049d408-b2a1-43a6-a2de-9516b7d0c78e\") " pod="openstack/placement-0b3a-account-create-6ch6l"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.886766 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl64m\" (UniqueName: \"kubernetes.io/projected/5049d408-b2a1-43a6-a2de-9516b7d0c78e-kube-api-access-sl64m\") pod \"placement-0b3a-account-create-6ch6l\" (UID: \"5049d408-b2a1-43a6-a2de-9516b7d0c78e\") " pod="openstack/placement-0b3a-account-create-6ch6l"
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.890110 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-869rb" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="registry-server" containerID="cri-o://d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95" gracePeriod=2
Oct 10 08:24:09 crc kubenswrapper[4732]: I1010 08:24:09.913071 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl64m\" (UniqueName: \"kubernetes.io/projected/5049d408-b2a1-43a6-a2de-9516b7d0c78e-kube-api-access-sl64m\") pod \"placement-0b3a-account-create-6ch6l\" (UID: \"5049d408-b2a1-43a6-a2de-9516b7d0c78e\") " pod="openstack/placement-0b3a-account-create-6ch6l"
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.001313 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0b3a-account-create-6ch6l"
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.411189 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-869rb"
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.502740 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwzp4\" (UniqueName: \"kubernetes.io/projected/01194045-e744-4a03-8949-7ae6153309f0-kube-api-access-lwzp4\") pod \"01194045-e744-4a03-8949-7ae6153309f0\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") "
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.503043 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-catalog-content\") pod \"01194045-e744-4a03-8949-7ae6153309f0\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") "
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.503119 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-utilities\") pod \"01194045-e744-4a03-8949-7ae6153309f0\" (UID: \"01194045-e744-4a03-8949-7ae6153309f0\") "
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.503952 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-utilities" (OuterVolumeSpecName: "utilities") pod "01194045-e744-4a03-8949-7ae6153309f0" (UID: "01194045-e744-4a03-8949-7ae6153309f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.517091 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01194045-e744-4a03-8949-7ae6153309f0-kube-api-access-lwzp4" (OuterVolumeSpecName: "kube-api-access-lwzp4") pod "01194045-e744-4a03-8949-7ae6153309f0" (UID: "01194045-e744-4a03-8949-7ae6153309f0"). InnerVolumeSpecName "kube-api-access-lwzp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.520141 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0b3a-account-create-6ch6l"]
Oct 10 08:24:10 crc kubenswrapper[4732]: W1010 08:24:10.528562 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5049d408_b2a1_43a6_a2de_9516b7d0c78e.slice/crio-4fea846bd2fbae8f74f8f20a4a0d9b8a96c2f56bcd13a31ea9fe8919681ae434 WatchSource:0}: Error finding container 4fea846bd2fbae8f74f8f20a4a0d9b8a96c2f56bcd13a31ea9fe8919681ae434: Status 404 returned error can't find the container with id 4fea846bd2fbae8f74f8f20a4a0d9b8a96c2f56bcd13a31ea9fe8919681ae434
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.557286 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01194045-e744-4a03-8949-7ae6153309f0" (UID: "01194045-e744-4a03-8949-7ae6153309f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.605056 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.605129 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01194045-e744-4a03-8949-7ae6153309f0-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.605142 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwzp4\" (UniqueName: \"kubernetes.io/projected/01194045-e744-4a03-8949-7ae6153309f0-kube-api-access-lwzp4\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.906126 4732 generic.go:334] "Generic (PLEG): container finished" podID="01194045-e744-4a03-8949-7ae6153309f0" containerID="d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95" exitCode=0
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.906219 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-869rb"
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.907566 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-869rb" event={"ID":"01194045-e744-4a03-8949-7ae6153309f0","Type":"ContainerDied","Data":"d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95"}
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.908408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-869rb" event={"ID":"01194045-e744-4a03-8949-7ae6153309f0","Type":"ContainerDied","Data":"3b8e54c8856827d64f2c26f8c5f220585d616abd50c791baf722bd526704fe2d"}
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.908603 4732 scope.go:117] "RemoveContainer" containerID="d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95"
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.912016 4732 generic.go:334] "Generic (PLEG): container finished" podID="5049d408-b2a1-43a6-a2de-9516b7d0c78e" containerID="5eb88d0285472497021f9892bac755c67c0bec73c260aad2b46c6243b83d3d87" exitCode=0
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.912071 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0b3a-account-create-6ch6l" event={"ID":"5049d408-b2a1-43a6-a2de-9516b7d0c78e","Type":"ContainerDied","Data":"5eb88d0285472497021f9892bac755c67c0bec73c260aad2b46c6243b83d3d87"}
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.912104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0b3a-account-create-6ch6l" event={"ID":"5049d408-b2a1-43a6-a2de-9516b7d0c78e","Type":"ContainerStarted","Data":"4fea846bd2fbae8f74f8f20a4a0d9b8a96c2f56bcd13a31ea9fe8919681ae434"}
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.947114 4732 scope.go:117] "RemoveContainer" containerID="4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f"
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.956175 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-869rb"]
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.965929 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-869rb"]
Oct 10 08:24:10 crc kubenswrapper[4732]: I1010 08:24:10.976819 4732 scope.go:117] "RemoveContainer" containerID="0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b"
Oct 10 08:24:11 crc kubenswrapper[4732]: I1010 08:24:11.000841 4732 scope.go:117] "RemoveContainer" containerID="d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95"
Oct 10 08:24:11 crc kubenswrapper[4732]: E1010 08:24:11.001343 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95\": container with ID starting with d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95 not found: ID does not exist" containerID="d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95"
Oct 10 08:24:11 crc kubenswrapper[4732]: I1010 08:24:11.001379 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95"} err="failed to get container status \"d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95\": rpc error: code = NotFound desc = could not find container \"d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95\": container with ID starting with d3fed5e131fa7416c3bbb4bd53aa93c13acb4895e87391aec0bd7e5e8ce6ca95 not found: ID does not exist"
Oct 10 08:24:11 crc kubenswrapper[4732]: I1010 08:24:11.001405 4732 scope.go:117] "RemoveContainer" containerID="4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f"
Oct 10 08:24:11 crc kubenswrapper[4732]: E1010 08:24:11.001782 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f\": container with ID starting with 4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f not found: ID does not exist" containerID="4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f"
Oct 10 08:24:11 crc kubenswrapper[4732]: I1010 08:24:11.001812 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f"} err="failed to get container status \"4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f\": rpc error: code = NotFound desc = could not find container \"4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f\": container with ID starting with 4a01657d43fee258ab78068414e4b9d3a14317c375180c2a1803fee02f94325f not found: ID does not exist"
Oct 10 08:24:11 crc kubenswrapper[4732]: I1010 08:24:11.001830 4732 scope.go:117] "RemoveContainer" containerID="0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b"
Oct 10 08:24:11 crc kubenswrapper[4732]: E1010 08:24:11.002510 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b\": container with ID starting with 0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b not found: ID does not exist" containerID="0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b"
Oct 10 08:24:11 crc kubenswrapper[4732]: I1010 08:24:11.002599 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b"} err="failed to get container status \"0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b\": rpc error: code = NotFound desc = could not find container \"0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b\": container with ID starting with 0117c8b65dcaaf766a8543b13b1a42da1b4ef3f6073470151cf4118a5d208b9b not found: ID does not exist"
Oct 10 08:24:11 crc kubenswrapper[4732]: I1010 08:24:11.679067 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01194045-e744-4a03-8949-7ae6153309f0" path="/var/lib/kubelet/pods/01194045-e744-4a03-8949-7ae6153309f0/volumes"
Oct 10 08:24:12 crc kubenswrapper[4732]: I1010 08:24:12.297756 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0b3a-account-create-6ch6l"
Oct 10 08:24:12 crc kubenswrapper[4732]: I1010 08:24:12.343620 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl64m\" (UniqueName: \"kubernetes.io/projected/5049d408-b2a1-43a6-a2de-9516b7d0c78e-kube-api-access-sl64m\") pod \"5049d408-b2a1-43a6-a2de-9516b7d0c78e\" (UID: \"5049d408-b2a1-43a6-a2de-9516b7d0c78e\") "
Oct 10 08:24:12 crc kubenswrapper[4732]: I1010 08:24:12.352811 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5049d408-b2a1-43a6-a2de-9516b7d0c78e-kube-api-access-sl64m" (OuterVolumeSpecName: "kube-api-access-sl64m") pod "5049d408-b2a1-43a6-a2de-9516b7d0c78e" (UID: "5049d408-b2a1-43a6-a2de-9516b7d0c78e"). InnerVolumeSpecName "kube-api-access-sl64m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:24:12 crc kubenswrapper[4732]: I1010 08:24:12.445472 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl64m\" (UniqueName: \"kubernetes.io/projected/5049d408-b2a1-43a6-a2de-9516b7d0c78e-kube-api-access-sl64m\") on node \"crc\" DevicePath \"\""
Oct 10 08:24:12 crc kubenswrapper[4732]: I1010 08:24:12.939197 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0b3a-account-create-6ch6l" event={"ID":"5049d408-b2a1-43a6-a2de-9516b7d0c78e","Type":"ContainerDied","Data":"4fea846bd2fbae8f74f8f20a4a0d9b8a96c2f56bcd13a31ea9fe8919681ae434"}
Oct 10 08:24:12 crc kubenswrapper[4732]: I1010 08:24:12.939238 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fea846bd2fbae8f74f8f20a4a0d9b8a96c2f56bcd13a31ea9fe8919681ae434"
Oct 10 08:24:12 crc kubenswrapper[4732]: I1010 08:24:12.939242 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0b3a-account-create-6ch6l"
Oct 10 08:24:13 crc kubenswrapper[4732]: E1010 08:24:13.015833 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5049d408_b2a1_43a6_a2de_9516b7d0c78e.slice\": RecentStats: unable to find data in memory cache]"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.920134 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b45676f95-p75gt"]
Oct 10 08:24:14 crc kubenswrapper[4732]: E1010 08:24:14.920847 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="extract-content"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.920862 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="extract-content"
Oct 10 08:24:14 crc kubenswrapper[4732]: E1010 08:24:14.920882 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5049d408-b2a1-43a6-a2de-9516b7d0c78e" containerName="mariadb-account-create"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.920891 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5049d408-b2a1-43a6-a2de-9516b7d0c78e" containerName="mariadb-account-create"
Oct 10 08:24:14 crc kubenswrapper[4732]: E1010 08:24:14.920913 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="registry-server"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.920921 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="registry-server"
Oct 10 08:24:14 crc kubenswrapper[4732]: E1010 08:24:14.920936 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="extract-utilities"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.920944 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="extract-utilities"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.921175 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5049d408-b2a1-43a6-a2de-9516b7d0c78e" containerName="mariadb-account-create"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.921199 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="01194045-e744-4a03-8949-7ae6153309f0" containerName="registry-server"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.922312 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b45676f95-p75gt"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.940647 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b45676f95-p75gt"]
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.947718 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-lw22c"]
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.949174 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-lw22c"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.952051 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.956528 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pbsbm"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.956832 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.984206 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lw22c"]
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.992760 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-nb\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.992807 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln72m\" (UniqueName: \"kubernetes.io/projected/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-kube-api-access-ln72m\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.992838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf2j5\" (UniqueName: \"kubernetes.io/projected/651a24c2-d598-4be7-82e7-676ae6360537-kube-api-access-sf2j5\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.992901 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-scripts\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.993014 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-config-data\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.993075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-dns-svc\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.993134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-combined-ca-bundle\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.993224 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-sb\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.993286 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-config\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt"
Oct 10 08:24:14 crc kubenswrapper[4732]: I1010 08:24:14.993319 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-logs\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c"
Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.094621 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-config\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt"
Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095000 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-logs\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c"
Oct 10
08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095043 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-nb\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095070 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln72m\" (UniqueName: \"kubernetes.io/projected/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-kube-api-access-ln72m\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095098 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf2j5\" (UniqueName: \"kubernetes.io/projected/651a24c2-d598-4be7-82e7-676ae6360537-kube-api-access-sf2j5\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-scripts\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095201 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-config-data\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095259 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-dns-svc\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095301 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-combined-ca-bundle\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-sb\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.095636 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-config\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.096261 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-sb\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.096289 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-logs\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.097875 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-dns-svc\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.099611 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-nb\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.105414 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-scripts\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.105509 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-combined-ca-bundle\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.105556 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-config-data\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " 
pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.115665 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln72m\" (UniqueName: \"kubernetes.io/projected/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-kube-api-access-ln72m\") pod \"placement-db-sync-lw22c\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.124513 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf2j5\" (UniqueName: \"kubernetes.io/projected/651a24c2-d598-4be7-82e7-676ae6360537-kube-api-access-sf2j5\") pod \"dnsmasq-dns-b45676f95-p75gt\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.242901 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.271395 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.733942 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b45676f95-p75gt"] Oct 10 08:24:15 crc kubenswrapper[4732]: W1010 08:24:15.738753 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod651a24c2_d598_4be7_82e7_676ae6360537.slice/crio-af74bdb254bf640a59c0a6a6729670299e0a66206aa81b5b24f806808609a148 WatchSource:0}: Error finding container af74bdb254bf640a59c0a6a6729670299e0a66206aa81b5b24f806808609a148: Status 404 returned error can't find the container with id af74bdb254bf640a59c0a6a6729670299e0a66206aa81b5b24f806808609a148 Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.791011 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-lw22c"] Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.966376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lw22c" event={"ID":"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5","Type":"ContainerStarted","Data":"a3fca4916fddb6f7e0b6d5ed8749bb751c0222890602e2d324490d631050ae6a"} Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.967975 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b45676f95-p75gt" event={"ID":"651a24c2-d598-4be7-82e7-676ae6360537","Type":"ContainerStarted","Data":"ee89d2f7cf7fc80918fd18cc58f60aca68859a37842864f5aa0235db3b2dab69"} Oct 10 08:24:15 crc kubenswrapper[4732]: I1010 08:24:15.967998 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b45676f95-p75gt" event={"ID":"651a24c2-d598-4be7-82e7-676ae6360537","Type":"ContainerStarted","Data":"af74bdb254bf640a59c0a6a6729670299e0a66206aa81b5b24f806808609a148"} Oct 10 08:24:16 crc kubenswrapper[4732]: I1010 08:24:16.979679 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="651a24c2-d598-4be7-82e7-676ae6360537" containerID="ee89d2f7cf7fc80918fd18cc58f60aca68859a37842864f5aa0235db3b2dab69" exitCode=0 Oct 10 08:24:16 crc kubenswrapper[4732]: I1010 08:24:16.979809 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b45676f95-p75gt" event={"ID":"651a24c2-d598-4be7-82e7-676ae6360537","Type":"ContainerDied","Data":"ee89d2f7cf7fc80918fd18cc58f60aca68859a37842864f5aa0235db3b2dab69"} Oct 10 08:24:17 crc kubenswrapper[4732]: I1010 08:24:17.992017 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b45676f95-p75gt" event={"ID":"651a24c2-d598-4be7-82e7-676ae6360537","Type":"ContainerStarted","Data":"5d40840329fc7c583e9aad24c61f27064823d79014a39fe4313e9dc62d092abf"} Oct 10 08:24:17 crc kubenswrapper[4732]: I1010 08:24:17.992390 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:18 crc kubenswrapper[4732]: I1010 08:24:18.018550 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b45676f95-p75gt" podStartSLOduration=4.018526044 podStartE2EDuration="4.018526044s" podCreationTimestamp="2025-10-10 08:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:24:18.015005309 +0000 UTC m=+5585.084596570" watchObservedRunningTime="2025-10-10 08:24:18.018526044 +0000 UTC m=+5585.088117305" Oct 10 08:24:20 crc kubenswrapper[4732]: I1010 08:24:20.010639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lw22c" event={"ID":"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5","Type":"ContainerStarted","Data":"59704af831b3620080dac096d92407b5de47d17447bf6674c20039cdd019fc6b"} Oct 10 08:24:20 crc kubenswrapper[4732]: I1010 08:24:20.047032 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-lw22c" 
podStartSLOduration=2.583345478 podStartE2EDuration="6.047008476s" podCreationTimestamp="2025-10-10 08:24:14 +0000 UTC" firstStartedPulling="2025-10-10 08:24:15.799750689 +0000 UTC m=+5582.869341920" lastFinishedPulling="2025-10-10 08:24:19.263413677 +0000 UTC m=+5586.333004918" observedRunningTime="2025-10-10 08:24:20.033973165 +0000 UTC m=+5587.103564436" watchObservedRunningTime="2025-10-10 08:24:20.047008476 +0000 UTC m=+5587.116599737" Oct 10 08:24:21 crc kubenswrapper[4732]: I1010 08:24:21.025976 4732 generic.go:334] "Generic (PLEG): container finished" podID="c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" containerID="59704af831b3620080dac096d92407b5de47d17447bf6674c20039cdd019fc6b" exitCode=0 Oct 10 08:24:21 crc kubenswrapper[4732]: I1010 08:24:21.026047 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lw22c" event={"ID":"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5","Type":"ContainerDied","Data":"59704af831b3620080dac096d92407b5de47d17447bf6674c20039cdd019fc6b"} Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.466747 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.641193 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-combined-ca-bundle\") pod \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.641340 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-config-data\") pod \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.641433 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-scripts\") pod \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.641559 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-logs\") pod \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.641798 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln72m\" (UniqueName: \"kubernetes.io/projected/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-kube-api-access-ln72m\") pod \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\" (UID: \"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5\") " Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.642785 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-logs" (OuterVolumeSpecName: "logs") pod "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" (UID: "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.648057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-kube-api-access-ln72m" (OuterVolumeSpecName: "kube-api-access-ln72m") pod "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" (UID: "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5"). InnerVolumeSpecName "kube-api-access-ln72m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.648462 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-scripts" (OuterVolumeSpecName: "scripts") pod "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" (UID: "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.674210 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-config-data" (OuterVolumeSpecName: "config-data") pod "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" (UID: "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.679785 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" (UID: "c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.744765 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.745158 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln72m\" (UniqueName: \"kubernetes.io/projected/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-kube-api-access-ln72m\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.745247 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.745338 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:22 crc kubenswrapper[4732]: I1010 08:24:22.745408 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.069272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-lw22c" event={"ID":"c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5","Type":"ContainerDied","Data":"a3fca4916fddb6f7e0b6d5ed8749bb751c0222890602e2d324490d631050ae6a"} Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.069337 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3fca4916fddb6f7e0b6d5ed8749bb751c0222890602e2d324490d631050ae6a" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.069421 4732 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-lw22c" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.148950 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-697b9444d8-dst4f"] Oct 10 08:24:23 crc kubenswrapper[4732]: E1010 08:24:23.149575 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" containerName="placement-db-sync" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.149603 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" containerName="placement-db-sync" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.149859 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" containerName="placement-db-sync" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.151197 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.156842 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.157474 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.157795 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.158129 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.159225 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pbsbm" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.163208 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-697b9444d8-dst4f"] Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.257585 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-config-data\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.257632 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-public-tls-certs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.257676 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-internal-tls-certs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.257745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-combined-ca-bundle\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.257773 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba9debd-ecde-489e-af5e-bbd2b4d0321f-logs\") pod \"placement-697b9444d8-dst4f\" (UID: 
\"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.257949 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-scripts\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.258186 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sklk4\" (UniqueName: \"kubernetes.io/projected/cba9debd-ecde-489e-af5e-bbd2b4d0321f-kube-api-access-sklk4\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: E1010 08:24:23.279772 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b7102b_f80d_4b3a_8c8d_1baaedf67ca5.slice/crio-a3fca4916fddb6f7e0b6d5ed8749bb751c0222890602e2d324490d631050ae6a\": RecentStats: unable to find data in memory cache]" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.360374 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-scripts\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.360916 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sklk4\" (UniqueName: \"kubernetes.io/projected/cba9debd-ecde-489e-af5e-bbd2b4d0321f-kube-api-access-sklk4\") pod \"placement-697b9444d8-dst4f\" (UID: 
\"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.361006 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-config-data\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.361038 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-public-tls-certs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.361067 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-internal-tls-certs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.361097 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-combined-ca-bundle\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.361132 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba9debd-ecde-489e-af5e-bbd2b4d0321f-logs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 
10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.361988 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba9debd-ecde-489e-af5e-bbd2b4d0321f-logs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.367249 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-scripts\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.367498 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-public-tls-certs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.368995 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-config-data\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.371179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-internal-tls-certs\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.394613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cba9debd-ecde-489e-af5e-bbd2b4d0321f-combined-ca-bundle\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.409331 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sklk4\" (UniqueName: \"kubernetes.io/projected/cba9debd-ecde-489e-af5e-bbd2b4d0321f-kube-api-access-sklk4\") pod \"placement-697b9444d8-dst4f\" (UID: \"cba9debd-ecde-489e-af5e-bbd2b4d0321f\") " pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.483770 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:23 crc kubenswrapper[4732]: I1010 08:24:23.745593 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-697b9444d8-dst4f"] Oct 10 08:24:23 crc kubenswrapper[4732]: W1010 08:24:23.747555 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba9debd_ecde_489e_af5e_bbd2b4d0321f.slice/crio-1755f39cb05921beb8b7cc7fbc1bca7c0c9a368a983aba2bc2def03ce9cd0e4d WatchSource:0}: Error finding container 1755f39cb05921beb8b7cc7fbc1bca7c0c9a368a983aba2bc2def03ce9cd0e4d: Status 404 returned error can't find the container with id 1755f39cb05921beb8b7cc7fbc1bca7c0c9a368a983aba2bc2def03ce9cd0e4d Oct 10 08:24:24 crc kubenswrapper[4732]: I1010 08:24:24.079362 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-697b9444d8-dst4f" event={"ID":"cba9debd-ecde-489e-af5e-bbd2b4d0321f","Type":"ContainerStarted","Data":"b9f016ffa8590f465e98223ef251021ffe676675c86d446ca8c2be2b1e042a93"} Oct 10 08:24:24 crc kubenswrapper[4732]: I1010 08:24:24.079407 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-697b9444d8-dst4f" 
event={"ID":"cba9debd-ecde-489e-af5e-bbd2b4d0321f","Type":"ContainerStarted","Data":"1755f39cb05921beb8b7cc7fbc1bca7c0c9a368a983aba2bc2def03ce9cd0e4d"} Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.090849 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-697b9444d8-dst4f" event={"ID":"cba9debd-ecde-489e-af5e-bbd2b4d0321f","Type":"ContainerStarted","Data":"e0f5dfd527966df0efeca05041610680ed88ac977dfe03c60c2c39830b0edb39"} Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.091333 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.127630 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-697b9444d8-dst4f" podStartSLOduration=2.127590034 podStartE2EDuration="2.127590034s" podCreationTimestamp="2025-10-10 08:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:24:25.11778777 +0000 UTC m=+5592.187379091" watchObservedRunningTime="2025-10-10 08:24:25.127590034 +0000 UTC m=+5592.197181315" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.243920 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.316908 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd46fcbfc-k2sk6"] Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.317386 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" podUID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerName="dnsmasq-dns" containerID="cri-o://b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f" gracePeriod=10 Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.355783 4732 patch_prober.go:28] 
interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.355855 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.811499 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.910283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-dns-svc\") pod \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.910324 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-sb\") pod \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.910445 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-config\") pod \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.910485 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-l8fsc\" (UniqueName: \"kubernetes.io/projected/cc8b109a-f085-43d8-b43a-d4e196ce7dae-kube-api-access-l8fsc\") pod \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.910524 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-nb\") pod \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\" (UID: \"cc8b109a-f085-43d8-b43a-d4e196ce7dae\") " Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.916865 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc8b109a-f085-43d8-b43a-d4e196ce7dae-kube-api-access-l8fsc" (OuterVolumeSpecName: "kube-api-access-l8fsc") pod "cc8b109a-f085-43d8-b43a-d4e196ce7dae" (UID: "cc8b109a-f085-43d8-b43a-d4e196ce7dae"). InnerVolumeSpecName "kube-api-access-l8fsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.955782 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc8b109a-f085-43d8-b43a-d4e196ce7dae" (UID: "cc8b109a-f085-43d8-b43a-d4e196ce7dae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.957028 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc8b109a-f085-43d8-b43a-d4e196ce7dae" (UID: "cc8b109a-f085-43d8-b43a-d4e196ce7dae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.964894 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-config" (OuterVolumeSpecName: "config") pod "cc8b109a-f085-43d8-b43a-d4e196ce7dae" (UID: "cc8b109a-f085-43d8-b43a-d4e196ce7dae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:24:25 crc kubenswrapper[4732]: I1010 08:24:25.995322 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc8b109a-f085-43d8-b43a-d4e196ce7dae" (UID: "cc8b109a-f085-43d8-b43a-d4e196ce7dae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.013106 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.013144 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8fsc\" (UniqueName: \"kubernetes.io/projected/cc8b109a-f085-43d8-b43a-d4e196ce7dae-kube-api-access-l8fsc\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.013159 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.013172 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 
08:24:26.013185 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc8b109a-f085-43d8-b43a-d4e196ce7dae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.103540 4732 generic.go:334] "Generic (PLEG): container finished" podID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerID="b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f" exitCode=0 Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.103607 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.103607 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" event={"ID":"cc8b109a-f085-43d8-b43a-d4e196ce7dae","Type":"ContainerDied","Data":"b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f"} Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.104000 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd46fcbfc-k2sk6" event={"ID":"cc8b109a-f085-43d8-b43a-d4e196ce7dae","Type":"ContainerDied","Data":"6da00f4fd9b5f509c075c73da7f31dfbfe3a81b1d67a613962ae89dc52a6fcf4"} Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.104034 4732 scope.go:117] "RemoveContainer" containerID="b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.108162 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.135631 4732 scope.go:117] "RemoveContainer" containerID="0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.148250 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd46fcbfc-k2sk6"] Oct 10 08:24:26 
crc kubenswrapper[4732]: I1010 08:24:26.156788 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bd46fcbfc-k2sk6"] Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.157162 4732 scope.go:117] "RemoveContainer" containerID="b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f" Oct 10 08:24:26 crc kubenswrapper[4732]: E1010 08:24:26.157586 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f\": container with ID starting with b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f not found: ID does not exist" containerID="b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.157628 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f"} err="failed to get container status \"b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f\": rpc error: code = NotFound desc = could not find container \"b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f\": container with ID starting with b15757c2232e396e582d41b21d9cff01102401ce6ca3af1bc6cf4c6e89036b3f not found: ID does not exist" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.157677 4732 scope.go:117] "RemoveContainer" containerID="0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450" Oct 10 08:24:26 crc kubenswrapper[4732]: E1010 08:24:26.158234 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450\": container with ID starting with 0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450 not found: ID does not exist" 
containerID="0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450" Oct 10 08:24:26 crc kubenswrapper[4732]: I1010 08:24:26.158270 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450"} err="failed to get container status \"0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450\": rpc error: code = NotFound desc = could not find container \"0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450\": container with ID starting with 0324f2a220085bcec1a166383189ec40e32c80869e993e064ce95a8226613450 not found: ID does not exist" Oct 10 08:24:27 crc kubenswrapper[4732]: I1010 08:24:27.678894 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" path="/var/lib/kubelet/pods/cc8b109a-f085-43d8-b43a-d4e196ce7dae/volumes" Oct 10 08:24:54 crc kubenswrapper[4732]: I1010 08:24:54.457423 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:54 crc kubenswrapper[4732]: I1010 08:24:54.461987 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-697b9444d8-dst4f" Oct 10 08:24:55 crc kubenswrapper[4732]: I1010 08:24:55.355563 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:24:55 crc kubenswrapper[4732]: I1010 08:24:55.355641 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Oct 10 08:24:55 crc kubenswrapper[4732]: I1010 08:24:55.355774 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:24:55 crc kubenswrapper[4732]: I1010 08:24:55.356657 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:24:55 crc kubenswrapper[4732]: I1010 08:24:55.356750 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" gracePeriod=600 Oct 10 08:24:55 crc kubenswrapper[4732]: E1010 08:24:55.491232 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:24:56 crc kubenswrapper[4732]: I1010 08:24:56.450042 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" exitCode=0 Oct 10 08:24:56 crc kubenswrapper[4732]: I1010 08:24:56.450098 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1"} Oct 10 08:24:56 crc kubenswrapper[4732]: I1010 08:24:56.450147 4732 scope.go:117] "RemoveContainer" containerID="7df48300028f267e40178e485796865eb5f10b524ed9fcd0a9aaeef67e08b38f" Oct 10 08:24:56 crc kubenswrapper[4732]: I1010 08:24:56.451118 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:24:56 crc kubenswrapper[4732]: E1010 08:24:56.451489 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:25:08 crc kubenswrapper[4732]: I1010 08:25:08.660438 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:25:08 crc kubenswrapper[4732]: E1010 08:25:08.661297 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.539553 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-t6gsv"] Oct 10 08:25:18 crc kubenswrapper[4732]: E1010 08:25:18.540621 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerName="dnsmasq-dns" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.540640 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerName="dnsmasq-dns" Oct 10 08:25:18 crc kubenswrapper[4732]: E1010 08:25:18.540665 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerName="init" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.540672 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerName="init" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.540926 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc8b109a-f085-43d8-b43a-d4e196ce7dae" containerName="dnsmasq-dns" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.541656 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t6gsv" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.549861 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t6gsv"] Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.628356 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xnvl6"] Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.629588 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xnvl6" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.646176 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xnvl6"] Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.674611 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56p5\" (UniqueName: \"kubernetes.io/projected/3c98e014-24c6-4c11-9965-34dca9a8aa12-kube-api-access-k56p5\") pod \"nova-api-db-create-t6gsv\" (UID: \"3c98e014-24c6-4c11-9965-34dca9a8aa12\") " pod="openstack/nova-api-db-create-t6gsv" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.724819 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jxdbl"] Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.726074 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jxdbl" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.739163 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jxdbl"] Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.776098 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7dm7\" (UniqueName: \"kubernetes.io/projected/39795c3d-ce50-4e53-befa-12c4619a7e26-kube-api-access-c7dm7\") pod \"nova-cell0-db-create-xnvl6\" (UID: \"39795c3d-ce50-4e53-befa-12c4619a7e26\") " pod="openstack/nova-cell0-db-create-xnvl6" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.776176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k56p5\" (UniqueName: \"kubernetes.io/projected/3c98e014-24c6-4c11-9965-34dca9a8aa12-kube-api-access-k56p5\") pod \"nova-api-db-create-t6gsv\" (UID: \"3c98e014-24c6-4c11-9965-34dca9a8aa12\") " pod="openstack/nova-api-db-create-t6gsv" Oct 10 08:25:18 crc 
kubenswrapper[4732]: I1010 08:25:18.796243 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56p5\" (UniqueName: \"kubernetes.io/projected/3c98e014-24c6-4c11-9965-34dca9a8aa12-kube-api-access-k56p5\") pod \"nova-api-db-create-t6gsv\" (UID: \"3c98e014-24c6-4c11-9965-34dca9a8aa12\") " pod="openstack/nova-api-db-create-t6gsv" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.873706 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-t6gsv" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.877472 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7dm7\" (UniqueName: \"kubernetes.io/projected/39795c3d-ce50-4e53-befa-12c4619a7e26-kube-api-access-c7dm7\") pod \"nova-cell0-db-create-xnvl6\" (UID: \"39795c3d-ce50-4e53-befa-12c4619a7e26\") " pod="openstack/nova-cell0-db-create-xnvl6" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.877576 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlhj4\" (UniqueName: \"kubernetes.io/projected/5adc8e93-5e71-44c1-a74a-45498406543a-kube-api-access-vlhj4\") pod \"nova-cell1-db-create-jxdbl\" (UID: \"5adc8e93-5e71-44c1-a74a-45498406543a\") " pod="openstack/nova-cell1-db-create-jxdbl" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.894999 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7dm7\" (UniqueName: \"kubernetes.io/projected/39795c3d-ce50-4e53-befa-12c4619a7e26-kube-api-access-c7dm7\") pod \"nova-cell0-db-create-xnvl6\" (UID: \"39795c3d-ce50-4e53-befa-12c4619a7e26\") " pod="openstack/nova-cell0-db-create-xnvl6" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.951143 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xnvl6" Oct 10 08:25:18 crc kubenswrapper[4732]: I1010 08:25:18.980322 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlhj4\" (UniqueName: \"kubernetes.io/projected/5adc8e93-5e71-44c1-a74a-45498406543a-kube-api-access-vlhj4\") pod \"nova-cell1-db-create-jxdbl\" (UID: \"5adc8e93-5e71-44c1-a74a-45498406543a\") " pod="openstack/nova-cell1-db-create-jxdbl" Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.003584 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlhj4\" (UniqueName: \"kubernetes.io/projected/5adc8e93-5e71-44c1-a74a-45498406543a-kube-api-access-vlhj4\") pod \"nova-cell1-db-create-jxdbl\" (UID: \"5adc8e93-5e71-44c1-a74a-45498406543a\") " pod="openstack/nova-cell1-db-create-jxdbl" Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.046341 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jxdbl" Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.387791 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-t6gsv"] Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.486088 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xnvl6"] Oct 10 08:25:19 crc kubenswrapper[4732]: W1010 08:25:19.487229 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39795c3d_ce50_4e53_befa_12c4619a7e26.slice/crio-9776c2312d1ce82801678195f6414e0f6dfcb8061ea4d704bc28c1438f0b9eb9 WatchSource:0}: Error finding container 9776c2312d1ce82801678195f6414e0f6dfcb8061ea4d704bc28c1438f0b9eb9: Status 404 returned error can't find the container with id 9776c2312d1ce82801678195f6414e0f6dfcb8061ea4d704bc28c1438f0b9eb9 Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.565396 4732 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jxdbl"] Oct 10 08:25:19 crc kubenswrapper[4732]: W1010 08:25:19.570151 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5adc8e93_5e71_44c1_a74a_45498406543a.slice/crio-b3c2de5012683e5e5712df6fa2747c04e67e8b99e5e947dc39d9e04ffb619402 WatchSource:0}: Error finding container b3c2de5012683e5e5712df6fa2747c04e67e8b99e5e947dc39d9e04ffb619402: Status 404 returned error can't find the container with id b3c2de5012683e5e5712df6fa2747c04e67e8b99e5e947dc39d9e04ffb619402 Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.693262 4732 generic.go:334] "Generic (PLEG): container finished" podID="3c98e014-24c6-4c11-9965-34dca9a8aa12" containerID="6388b7d2c21e1bfad4dc15434200ea4a8cf75be618f410d97a0216d579f2b541" exitCode=0 Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.693809 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t6gsv" event={"ID":"3c98e014-24c6-4c11-9965-34dca9a8aa12","Type":"ContainerDied","Data":"6388b7d2c21e1bfad4dc15434200ea4a8cf75be618f410d97a0216d579f2b541"} Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.693842 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t6gsv" event={"ID":"3c98e014-24c6-4c11-9965-34dca9a8aa12","Type":"ContainerStarted","Data":"84192246cfb11deeb1fc2a9e75855bd3311bc8bdc0097d009a5b233c653372f6"} Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.696199 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xnvl6" event={"ID":"39795c3d-ce50-4e53-befa-12c4619a7e26","Type":"ContainerStarted","Data":"9776c2312d1ce82801678195f6414e0f6dfcb8061ea4d704bc28c1438f0b9eb9"} Oct 10 08:25:19 crc kubenswrapper[4732]: I1010 08:25:19.698922 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jxdbl" 
event={"ID":"5adc8e93-5e71-44c1-a74a-45498406543a","Type":"ContainerStarted","Data":"b3c2de5012683e5e5712df6fa2747c04e67e8b99e5e947dc39d9e04ffb619402"} Oct 10 08:25:20 crc kubenswrapper[4732]: I1010 08:25:20.710397 4732 generic.go:334] "Generic (PLEG): container finished" podID="5adc8e93-5e71-44c1-a74a-45498406543a" containerID="1c673562f6bacf4507a50dfaa1eeabd5c3a2ac05cde1ce7d47a31dcdd9ee8391" exitCode=0 Oct 10 08:25:20 crc kubenswrapper[4732]: I1010 08:25:20.710548 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jxdbl" event={"ID":"5adc8e93-5e71-44c1-a74a-45498406543a","Type":"ContainerDied","Data":"1c673562f6bacf4507a50dfaa1eeabd5c3a2ac05cde1ce7d47a31dcdd9ee8391"} Oct 10 08:25:20 crc kubenswrapper[4732]: I1010 08:25:20.712324 4732 generic.go:334] "Generic (PLEG): container finished" podID="39795c3d-ce50-4e53-befa-12c4619a7e26" containerID="51f41b6697f8b6a0ceb62dd647fa5eb7a5c062a7b355ccda36b137b07f784c75" exitCode=0 Oct 10 08:25:20 crc kubenswrapper[4732]: I1010 08:25:20.712364 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xnvl6" event={"ID":"39795c3d-ce50-4e53-befa-12c4619a7e26","Type":"ContainerDied","Data":"51f41b6697f8b6a0ceb62dd647fa5eb7a5c062a7b355ccda36b137b07f784c75"} Oct 10 08:25:21 crc kubenswrapper[4732]: I1010 08:25:21.038524 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t6gsv" Oct 10 08:25:21 crc kubenswrapper[4732]: I1010 08:25:21.238718 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k56p5\" (UniqueName: \"kubernetes.io/projected/3c98e014-24c6-4c11-9965-34dca9a8aa12-kube-api-access-k56p5\") pod \"3c98e014-24c6-4c11-9965-34dca9a8aa12\" (UID: \"3c98e014-24c6-4c11-9965-34dca9a8aa12\") " Oct 10 08:25:21 crc kubenswrapper[4732]: I1010 08:25:21.245275 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c98e014-24c6-4c11-9965-34dca9a8aa12-kube-api-access-k56p5" (OuterVolumeSpecName: "kube-api-access-k56p5") pod "3c98e014-24c6-4c11-9965-34dca9a8aa12" (UID: "3c98e014-24c6-4c11-9965-34dca9a8aa12"). InnerVolumeSpecName "kube-api-access-k56p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:25:21 crc kubenswrapper[4732]: I1010 08:25:21.340971 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k56p5\" (UniqueName: \"kubernetes.io/projected/3c98e014-24c6-4c11-9965-34dca9a8aa12-kube-api-access-k56p5\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:21 crc kubenswrapper[4732]: I1010 08:25:21.722924 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-t6gsv" event={"ID":"3c98e014-24c6-4c11-9965-34dca9a8aa12","Type":"ContainerDied","Data":"84192246cfb11deeb1fc2a9e75855bd3311bc8bdc0097d009a5b233c653372f6"} Oct 10 08:25:21 crc kubenswrapper[4732]: I1010 08:25:21.722972 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84192246cfb11deeb1fc2a9e75855bd3311bc8bdc0097d009a5b233c653372f6" Oct 10 08:25:21 crc kubenswrapper[4732]: I1010 08:25:21.723107 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-t6gsv" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.129792 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xnvl6" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.134398 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jxdbl" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.261226 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlhj4\" (UniqueName: \"kubernetes.io/projected/5adc8e93-5e71-44c1-a74a-45498406543a-kube-api-access-vlhj4\") pod \"5adc8e93-5e71-44c1-a74a-45498406543a\" (UID: \"5adc8e93-5e71-44c1-a74a-45498406543a\") " Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.261541 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7dm7\" (UniqueName: \"kubernetes.io/projected/39795c3d-ce50-4e53-befa-12c4619a7e26-kube-api-access-c7dm7\") pod \"39795c3d-ce50-4e53-befa-12c4619a7e26\" (UID: \"39795c3d-ce50-4e53-befa-12c4619a7e26\") " Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.265020 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5adc8e93-5e71-44c1-a74a-45498406543a-kube-api-access-vlhj4" (OuterVolumeSpecName: "kube-api-access-vlhj4") pod "5adc8e93-5e71-44c1-a74a-45498406543a" (UID: "5adc8e93-5e71-44c1-a74a-45498406543a"). InnerVolumeSpecName "kube-api-access-vlhj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.266581 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39795c3d-ce50-4e53-befa-12c4619a7e26-kube-api-access-c7dm7" (OuterVolumeSpecName: "kube-api-access-c7dm7") pod "39795c3d-ce50-4e53-befa-12c4619a7e26" (UID: "39795c3d-ce50-4e53-befa-12c4619a7e26"). InnerVolumeSpecName "kube-api-access-c7dm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.364900 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlhj4\" (UniqueName: \"kubernetes.io/projected/5adc8e93-5e71-44c1-a74a-45498406543a-kube-api-access-vlhj4\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.365206 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7dm7\" (UniqueName: \"kubernetes.io/projected/39795c3d-ce50-4e53-befa-12c4619a7e26-kube-api-access-c7dm7\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.660684 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:25:22 crc kubenswrapper[4732]: E1010 08:25:22.660944 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.745628 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xnvl6" 
event={"ID":"39795c3d-ce50-4e53-befa-12c4619a7e26","Type":"ContainerDied","Data":"9776c2312d1ce82801678195f6414e0f6dfcb8061ea4d704bc28c1438f0b9eb9"} Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.745676 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9776c2312d1ce82801678195f6414e0f6dfcb8061ea4d704bc28c1438f0b9eb9" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.745645 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xnvl6" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.749670 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jxdbl" event={"ID":"5adc8e93-5e71-44c1-a74a-45498406543a","Type":"ContainerDied","Data":"b3c2de5012683e5e5712df6fa2747c04e67e8b99e5e947dc39d9e04ffb619402"} Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.749739 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3c2de5012683e5e5712df6fa2747c04e67e8b99e5e947dc39d9e04ffb619402" Oct 10 08:25:22 crc kubenswrapper[4732]: I1010 08:25:22.749785 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jxdbl" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.775828 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-737a-account-create-z248t"] Oct 10 08:25:28 crc kubenswrapper[4732]: E1010 08:25:28.777793 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5adc8e93-5e71-44c1-a74a-45498406543a" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.777987 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5adc8e93-5e71-44c1-a74a-45498406543a" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: E1010 08:25:28.778084 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39795c3d-ce50-4e53-befa-12c4619a7e26" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.778155 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="39795c3d-ce50-4e53-befa-12c4619a7e26" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: E1010 08:25:28.778235 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c98e014-24c6-4c11-9965-34dca9a8aa12" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.778306 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c98e014-24c6-4c11-9965-34dca9a8aa12" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.778590 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5adc8e93-5e71-44c1-a74a-45498406543a" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.778718 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c98e014-24c6-4c11-9965-34dca9a8aa12" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.778797 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="39795c3d-ce50-4e53-befa-12c4619a7e26" containerName="mariadb-database-create" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.779616 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-737a-account-create-z248t" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.782551 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.785638 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-737a-account-create-z248t"] Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.885079 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsfp6\" (UniqueName: \"kubernetes.io/projected/d5fba53e-2bc4-4a82-982a-22f39e81a78f-kube-api-access-zsfp6\") pod \"nova-api-737a-account-create-z248t\" (UID: \"d5fba53e-2bc4-4a82-982a-22f39e81a78f\") " pod="openstack/nova-api-737a-account-create-z248t" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.966738 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8ad3-account-create-frqct"] Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.967864 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8ad3-account-create-frqct" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.970104 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.977052 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8ad3-account-create-frqct"] Oct 10 08:25:28 crc kubenswrapper[4732]: I1010 08:25:28.987679 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsfp6\" (UniqueName: \"kubernetes.io/projected/d5fba53e-2bc4-4a82-982a-22f39e81a78f-kube-api-access-zsfp6\") pod \"nova-api-737a-account-create-z248t\" (UID: \"d5fba53e-2bc4-4a82-982a-22f39e81a78f\") " pod="openstack/nova-api-737a-account-create-z248t" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.009536 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsfp6\" (UniqueName: \"kubernetes.io/projected/d5fba53e-2bc4-4a82-982a-22f39e81a78f-kube-api-access-zsfp6\") pod \"nova-api-737a-account-create-z248t\" (UID: \"d5fba53e-2bc4-4a82-982a-22f39e81a78f\") " pod="openstack/nova-api-737a-account-create-z248t" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.089650 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd25s\" (UniqueName: \"kubernetes.io/projected/d68ece14-1313-4f06-bce9-a46535685ad4-kube-api-access-rd25s\") pod \"nova-cell0-8ad3-account-create-frqct\" (UID: \"d68ece14-1313-4f06-bce9-a46535685ad4\") " pod="openstack/nova-cell0-8ad3-account-create-frqct" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.107961 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-737a-account-create-z248t" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.187164 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8455-account-create-m7fpc"] Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.197889 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8455-account-create-m7fpc"] Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.197932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd25s\" (UniqueName: \"kubernetes.io/projected/d68ece14-1313-4f06-bce9-a46535685ad4-kube-api-access-rd25s\") pod \"nova-cell0-8ad3-account-create-frqct\" (UID: \"d68ece14-1313-4f06-bce9-a46535685ad4\") " pod="openstack/nova-cell0-8ad3-account-create-frqct" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.198082 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8455-account-create-m7fpc" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.200755 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.225361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd25s\" (UniqueName: \"kubernetes.io/projected/d68ece14-1313-4f06-bce9-a46535685ad4-kube-api-access-rd25s\") pod \"nova-cell0-8ad3-account-create-frqct\" (UID: \"d68ece14-1313-4f06-bce9-a46535685ad4\") " pod="openstack/nova-cell0-8ad3-account-create-frqct" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.283424 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8ad3-account-create-frqct" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.300616 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsn6l\" (UniqueName: \"kubernetes.io/projected/655f526b-2365-4ecd-b3d5-7d60beffedf1-kube-api-access-xsn6l\") pod \"nova-cell1-8455-account-create-m7fpc\" (UID: \"655f526b-2365-4ecd-b3d5-7d60beffedf1\") " pod="openstack/nova-cell1-8455-account-create-m7fpc" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.402213 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsn6l\" (UniqueName: \"kubernetes.io/projected/655f526b-2365-4ecd-b3d5-7d60beffedf1-kube-api-access-xsn6l\") pod \"nova-cell1-8455-account-create-m7fpc\" (UID: \"655f526b-2365-4ecd-b3d5-7d60beffedf1\") " pod="openstack/nova-cell1-8455-account-create-m7fpc" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.423536 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsn6l\" (UniqueName: \"kubernetes.io/projected/655f526b-2365-4ecd-b3d5-7d60beffedf1-kube-api-access-xsn6l\") pod \"nova-cell1-8455-account-create-m7fpc\" (UID: \"655f526b-2365-4ecd-b3d5-7d60beffedf1\") " pod="openstack/nova-cell1-8455-account-create-m7fpc" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.576083 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8455-account-create-m7fpc" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.617260 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-737a-account-create-z248t"] Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.715427 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8ad3-account-create-frqct"] Oct 10 08:25:29 crc kubenswrapper[4732]: W1010 08:25:29.716534 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd68ece14_1313_4f06_bce9_a46535685ad4.slice/crio-b165dbdc9494d9a3b82acc85dbbd51156f92a0f021ba6900cc4fee008ba3593d WatchSource:0}: Error finding container b165dbdc9494d9a3b82acc85dbbd51156f92a0f021ba6900cc4fee008ba3593d: Status 404 returned error can't find the container with id b165dbdc9494d9a3b82acc85dbbd51156f92a0f021ba6900cc4fee008ba3593d Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.814611 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-737a-account-create-z248t" event={"ID":"d5fba53e-2bc4-4a82-982a-22f39e81a78f","Type":"ContainerStarted","Data":"b0031338a699ef23aa284fecd5b17da30969f81adc3bdfff92ccd5824a2155ad"} Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.814656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-737a-account-create-z248t" event={"ID":"d5fba53e-2bc4-4a82-982a-22f39e81a78f","Type":"ContainerStarted","Data":"7c6867c8dd2117c5ed142c6a4fdb371234cb83904e5b4bacad663132c1189230"} Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.820789 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8ad3-account-create-frqct" event={"ID":"d68ece14-1313-4f06-bce9-a46535685ad4","Type":"ContainerStarted","Data":"b165dbdc9494d9a3b82acc85dbbd51156f92a0f021ba6900cc4fee008ba3593d"} Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.842443 4732 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-737a-account-create-z248t" podStartSLOduration=1.842421536 podStartE2EDuration="1.842421536s" podCreationTimestamp="2025-10-10 08:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:25:29.842089288 +0000 UTC m=+5656.911680539" watchObservedRunningTime="2025-10-10 08:25:29.842421536 +0000 UTC m=+5656.912012807" Oct 10 08:25:29 crc kubenswrapper[4732]: I1010 08:25:29.999529 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8455-account-create-m7fpc"] Oct 10 08:25:30 crc kubenswrapper[4732]: W1010 08:25:30.004307 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod655f526b_2365_4ecd_b3d5_7d60beffedf1.slice/crio-9337eefdec8b98430e08ed85bd98b1d14d51b8a3b22b3c25d1fc39295d86d6a1 WatchSource:0}: Error finding container 9337eefdec8b98430e08ed85bd98b1d14d51b8a3b22b3c25d1fc39295d86d6a1: Status 404 returned error can't find the container with id 9337eefdec8b98430e08ed85bd98b1d14d51b8a3b22b3c25d1fc39295d86d6a1 Oct 10 08:25:30 crc kubenswrapper[4732]: I1010 08:25:30.827820 4732 generic.go:334] "Generic (PLEG): container finished" podID="d5fba53e-2bc4-4a82-982a-22f39e81a78f" containerID="b0031338a699ef23aa284fecd5b17da30969f81adc3bdfff92ccd5824a2155ad" exitCode=0 Oct 10 08:25:30 crc kubenswrapper[4732]: I1010 08:25:30.827890 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-737a-account-create-z248t" event={"ID":"d5fba53e-2bc4-4a82-982a-22f39e81a78f","Type":"ContainerDied","Data":"b0031338a699ef23aa284fecd5b17da30969f81adc3bdfff92ccd5824a2155ad"} Oct 10 08:25:30 crc kubenswrapper[4732]: I1010 08:25:30.830459 4732 generic.go:334] "Generic (PLEG): container finished" podID="655f526b-2365-4ecd-b3d5-7d60beffedf1" 
containerID="65e339abb51245eb40d70c18c0e77adabb6cd759059a0fcf4bc70c3fdf875a34" exitCode=0 Oct 10 08:25:30 crc kubenswrapper[4732]: I1010 08:25:30.830552 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8455-account-create-m7fpc" event={"ID":"655f526b-2365-4ecd-b3d5-7d60beffedf1","Type":"ContainerDied","Data":"65e339abb51245eb40d70c18c0e77adabb6cd759059a0fcf4bc70c3fdf875a34"} Oct 10 08:25:30 crc kubenswrapper[4732]: I1010 08:25:30.830616 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8455-account-create-m7fpc" event={"ID":"655f526b-2365-4ecd-b3d5-7d60beffedf1","Type":"ContainerStarted","Data":"9337eefdec8b98430e08ed85bd98b1d14d51b8a3b22b3c25d1fc39295d86d6a1"} Oct 10 08:25:30 crc kubenswrapper[4732]: I1010 08:25:30.832835 4732 generic.go:334] "Generic (PLEG): container finished" podID="d68ece14-1313-4f06-bce9-a46535685ad4" containerID="1326d8d9629b26dfca3f549d282a87e59e32f2e45fe3a13dc5788e695263aa74" exitCode=0 Oct 10 08:25:30 crc kubenswrapper[4732]: I1010 08:25:30.832865 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8ad3-account-create-frqct" event={"ID":"d68ece14-1313-4f06-bce9-a46535685ad4","Type":"ContainerDied","Data":"1326d8d9629b26dfca3f549d282a87e59e32f2e45fe3a13dc5788e695263aa74"} Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.059407 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bt4s6"] Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.072182 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bt4s6"] Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.267372 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8ad3-account-create-frqct" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.273052 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8455-account-create-m7fpc" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.279247 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-737a-account-create-z248t" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.370263 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd25s\" (UniqueName: \"kubernetes.io/projected/d68ece14-1313-4f06-bce9-a46535685ad4-kube-api-access-rd25s\") pod \"d68ece14-1313-4f06-bce9-a46535685ad4\" (UID: \"d68ece14-1313-4f06-bce9-a46535685ad4\") " Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.370329 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsfp6\" (UniqueName: \"kubernetes.io/projected/d5fba53e-2bc4-4a82-982a-22f39e81a78f-kube-api-access-zsfp6\") pod \"d5fba53e-2bc4-4a82-982a-22f39e81a78f\" (UID: \"d5fba53e-2bc4-4a82-982a-22f39e81a78f\") " Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.370430 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsn6l\" (UniqueName: \"kubernetes.io/projected/655f526b-2365-4ecd-b3d5-7d60beffedf1-kube-api-access-xsn6l\") pod \"655f526b-2365-4ecd-b3d5-7d60beffedf1\" (UID: \"655f526b-2365-4ecd-b3d5-7d60beffedf1\") " Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.375936 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655f526b-2365-4ecd-b3d5-7d60beffedf1-kube-api-access-xsn6l" (OuterVolumeSpecName: "kube-api-access-xsn6l") pod "655f526b-2365-4ecd-b3d5-7d60beffedf1" (UID: "655f526b-2365-4ecd-b3d5-7d60beffedf1"). InnerVolumeSpecName "kube-api-access-xsn6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.376966 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68ece14-1313-4f06-bce9-a46535685ad4-kube-api-access-rd25s" (OuterVolumeSpecName: "kube-api-access-rd25s") pod "d68ece14-1313-4f06-bce9-a46535685ad4" (UID: "d68ece14-1313-4f06-bce9-a46535685ad4"). InnerVolumeSpecName "kube-api-access-rd25s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.378848 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5fba53e-2bc4-4a82-982a-22f39e81a78f-kube-api-access-zsfp6" (OuterVolumeSpecName: "kube-api-access-zsfp6") pod "d5fba53e-2bc4-4a82-982a-22f39e81a78f" (UID: "d5fba53e-2bc4-4a82-982a-22f39e81a78f"). InnerVolumeSpecName "kube-api-access-zsfp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.471889 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd25s\" (UniqueName: \"kubernetes.io/projected/d68ece14-1313-4f06-bce9-a46535685ad4-kube-api-access-rd25s\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.471925 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsfp6\" (UniqueName: \"kubernetes.io/projected/d5fba53e-2bc4-4a82-982a-22f39e81a78f-kube-api-access-zsfp6\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.471934 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsn6l\" (UniqueName: \"kubernetes.io/projected/655f526b-2365-4ecd-b3d5-7d60beffedf1-kube-api-access-xsn6l\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.863526 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8455-account-create-m7fpc" 
event={"ID":"655f526b-2365-4ecd-b3d5-7d60beffedf1","Type":"ContainerDied","Data":"9337eefdec8b98430e08ed85bd98b1d14d51b8a3b22b3c25d1fc39295d86d6a1"} Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.863595 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9337eefdec8b98430e08ed85bd98b1d14d51b8a3b22b3c25d1fc39295d86d6a1" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.863844 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8455-account-create-m7fpc" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.866920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8ad3-account-create-frqct" event={"ID":"d68ece14-1313-4f06-bce9-a46535685ad4","Type":"ContainerDied","Data":"b165dbdc9494d9a3b82acc85dbbd51156f92a0f021ba6900cc4fee008ba3593d"} Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.866971 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b165dbdc9494d9a3b82acc85dbbd51156f92a0f021ba6900cc4fee008ba3593d" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.866932 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8ad3-account-create-frqct" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.873598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-737a-account-create-z248t" event={"ID":"d5fba53e-2bc4-4a82-982a-22f39e81a78f","Type":"ContainerDied","Data":"7c6867c8dd2117c5ed142c6a4fdb371234cb83904e5b4bacad663132c1189230"} Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.873654 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6867c8dd2117c5ed142c6a4fdb371234cb83904e5b4bacad663132c1189230" Oct 10 08:25:32 crc kubenswrapper[4732]: I1010 08:25:32.873762 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-737a-account-create-z248t" Oct 10 08:25:33 crc kubenswrapper[4732]: I1010 08:25:33.680076 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5a352b-80c6-4aa1-b84c-f065c6d3ced6" path="/var/lib/kubelet/pods/fb5a352b-80c6-4aa1-b84c-f065c6d3ced6/volumes" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.119249 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7ww"] Oct 10 08:25:34 crc kubenswrapper[4732]: E1010 08:25:34.120057 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655f526b-2365-4ecd-b3d5-7d60beffedf1" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.120076 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="655f526b-2365-4ecd-b3d5-7d60beffedf1" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: E1010 08:25:34.120095 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5fba53e-2bc4-4a82-982a-22f39e81a78f" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.120102 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5fba53e-2bc4-4a82-982a-22f39e81a78f" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: E1010 08:25:34.120119 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68ece14-1313-4f06-bce9-a46535685ad4" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.120126 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68ece14-1313-4f06-bce9-a46535685ad4" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.120300 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68ece14-1313-4f06-bce9-a46535685ad4" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.120315 
4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="655f526b-2365-4ecd-b3d5-7d60beffedf1" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.120329 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5fba53e-2bc4-4a82-982a-22f39e81a78f" containerName="mariadb-account-create" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.120889 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.125414 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4gp6v" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.127540 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.133644 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7ww"] Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.135979 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.207663 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54j82\" (UniqueName: \"kubernetes.io/projected/cac21c01-cdf9-4adf-a9ae-19219c311c33-kube-api-access-54j82\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.207746 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zb7ww\" 
(UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.207834 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-config-data\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.208105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-scripts\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.311408 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-scripts\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.311610 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54j82\" (UniqueName: \"kubernetes.io/projected/cac21c01-cdf9-4adf-a9ae-19219c311c33-kube-api-access-54j82\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.311653 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.311875 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-config-data\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.319524 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-scripts\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.319741 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-config-data\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.331486 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zb7ww\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.338429 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54j82\" (UniqueName: \"kubernetes.io/projected/cac21c01-cdf9-4adf-a9ae-19219c311c33-kube-api-access-54j82\") pod \"nova-cell0-conductor-db-sync-zb7ww\" 
(UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.445338 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:34 crc kubenswrapper[4732]: I1010 08:25:34.903403 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7ww"] Oct 10 08:25:34 crc kubenswrapper[4732]: W1010 08:25:34.908729 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac21c01_cdf9_4adf_a9ae_19219c311c33.slice/crio-5a6c94e105c30595d411d1ce47c5ef587c8dbe26b70cba8e8b95d5b9bd55ac91 WatchSource:0}: Error finding container 5a6c94e105c30595d411d1ce47c5ef587c8dbe26b70cba8e8b95d5b9bd55ac91: Status 404 returned error can't find the container with id 5a6c94e105c30595d411d1ce47c5ef587c8dbe26b70cba8e8b95d5b9bd55ac91 Oct 10 08:25:35 crc kubenswrapper[4732]: I1010 08:25:35.909028 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" event={"ID":"cac21c01-cdf9-4adf-a9ae-19219c311c33","Type":"ContainerStarted","Data":"5a6c94e105c30595d411d1ce47c5ef587c8dbe26b70cba8e8b95d5b9bd55ac91"} Oct 10 08:25:37 crc kubenswrapper[4732]: I1010 08:25:37.661448 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:25:37 crc kubenswrapper[4732]: E1010 08:25:37.662069 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 
10 08:25:42 crc kubenswrapper[4732]: I1010 08:25:42.041329 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3a60-account-create-k5pfm"] Oct 10 08:25:42 crc kubenswrapper[4732]: I1010 08:25:42.052221 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3a60-account-create-k5pfm"] Oct 10 08:25:43 crc kubenswrapper[4732]: I1010 08:25:43.676446 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa17fdc-7931-4225-bf16-3da2631cad83" path="/var/lib/kubelet/pods/2fa17fdc-7931-4225-bf16-3da2631cad83/volumes" Oct 10 08:25:43 crc kubenswrapper[4732]: I1010 08:25:43.989853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" event={"ID":"cac21c01-cdf9-4adf-a9ae-19219c311c33","Type":"ContainerStarted","Data":"c52c7a7628f1c740ba714e3f51d19a114b0b3e955dcd6c66e6e68b05dfa54f68"} Oct 10 08:25:44 crc kubenswrapper[4732]: I1010 08:25:44.013210 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" podStartSLOduration=1.694789981 podStartE2EDuration="10.013190158s" podCreationTimestamp="2025-10-10 08:25:34 +0000 UTC" firstStartedPulling="2025-10-10 08:25:34.913957262 +0000 UTC m=+5661.983548503" lastFinishedPulling="2025-10-10 08:25:43.232357399 +0000 UTC m=+5670.301948680" observedRunningTime="2025-10-10 08:25:44.006889009 +0000 UTC m=+5671.076480270" watchObservedRunningTime="2025-10-10 08:25:44.013190158 +0000 UTC m=+5671.082781399" Oct 10 08:25:49 crc kubenswrapper[4732]: I1010 08:25:49.048244 4732 generic.go:334] "Generic (PLEG): container finished" podID="cac21c01-cdf9-4adf-a9ae-19219c311c33" containerID="c52c7a7628f1c740ba714e3f51d19a114b0b3e955dcd6c66e6e68b05dfa54f68" exitCode=0 Oct 10 08:25:49 crc kubenswrapper[4732]: I1010 08:25:49.048342 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" 
event={"ID":"cac21c01-cdf9-4adf-a9ae-19219c311c33","Type":"ContainerDied","Data":"c52c7a7628f1c740ba714e3f51d19a114b0b3e955dcd6c66e6e68b05dfa54f68"} Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.478267 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.642436 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-scripts\") pod \"cac21c01-cdf9-4adf-a9ae-19219c311c33\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.642578 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54j82\" (UniqueName: \"kubernetes.io/projected/cac21c01-cdf9-4adf-a9ae-19219c311c33-kube-api-access-54j82\") pod \"cac21c01-cdf9-4adf-a9ae-19219c311c33\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.642624 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-combined-ca-bundle\") pod \"cac21c01-cdf9-4adf-a9ae-19219c311c33\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.642816 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-config-data\") pod \"cac21c01-cdf9-4adf-a9ae-19219c311c33\" (UID: \"cac21c01-cdf9-4adf-a9ae-19219c311c33\") " Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.648651 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-scripts" (OuterVolumeSpecName: "scripts") pod 
"cac21c01-cdf9-4adf-a9ae-19219c311c33" (UID: "cac21c01-cdf9-4adf-a9ae-19219c311c33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.651675 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac21c01-cdf9-4adf-a9ae-19219c311c33-kube-api-access-54j82" (OuterVolumeSpecName: "kube-api-access-54j82") pod "cac21c01-cdf9-4adf-a9ae-19219c311c33" (UID: "cac21c01-cdf9-4adf-a9ae-19219c311c33"). InnerVolumeSpecName "kube-api-access-54j82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.672300 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-config-data" (OuterVolumeSpecName: "config-data") pod "cac21c01-cdf9-4adf-a9ae-19219c311c33" (UID: "cac21c01-cdf9-4adf-a9ae-19219c311c33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.685089 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cac21c01-cdf9-4adf-a9ae-19219c311c33" (UID: "cac21c01-cdf9-4adf-a9ae-19219c311c33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.746102 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.746145 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.746166 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54j82\" (UniqueName: \"kubernetes.io/projected/cac21c01-cdf9-4adf-a9ae-19219c311c33-kube-api-access-54j82\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:50 crc kubenswrapper[4732]: I1010 08:25:50.746185 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cac21c01-cdf9-4adf-a9ae-19219c311c33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.069756 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" event={"ID":"cac21c01-cdf9-4adf-a9ae-19219c311c33","Type":"ContainerDied","Data":"5a6c94e105c30595d411d1ce47c5ef587c8dbe26b70cba8e8b95d5b9bd55ac91"} Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.069967 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6c94e105c30595d411d1ce47c5ef587c8dbe26b70cba8e8b95d5b9bd55ac91" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.069827 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zb7ww" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.151204 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:25:51 crc kubenswrapper[4732]: E1010 08:25:51.151682 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac21c01-cdf9-4adf-a9ae-19219c311c33" containerName="nova-cell0-conductor-db-sync" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.151723 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac21c01-cdf9-4adf-a9ae-19219c311c33" containerName="nova-cell0-conductor-db-sync" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.151965 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac21c01-cdf9-4adf-a9ae-19219c311c33" containerName="nova-cell0-conductor-db-sync" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.152654 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.154758 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4gp6v" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.155303 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.167728 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.254688 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 
08:25:51.254798 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qmc\" (UniqueName: \"kubernetes.io/projected/52200daf-815f-40c7-8359-a7140dcd863f-kube-api-access-85qmc\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.254869 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.356962 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.357137 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.357223 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qmc\" (UniqueName: \"kubernetes.io/projected/52200daf-815f-40c7-8359-a7140dcd863f-kube-api-access-85qmc\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.363830 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.365366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.382990 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qmc\" (UniqueName: \"kubernetes.io/projected/52200daf-815f-40c7-8359-a7140dcd863f-kube-api-access-85qmc\") pod \"nova-cell0-conductor-0\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.472044 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:51 crc kubenswrapper[4732]: I1010 08:25:51.965767 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 08:25:52 crc kubenswrapper[4732]: I1010 08:25:52.081464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"52200daf-815f-40c7-8359-a7140dcd863f","Type":"ContainerStarted","Data":"a3001508cbd31368e098021f38536f33a7f688fe8038815b6d2db0b1525adbfc"} Oct 10 08:25:52 crc kubenswrapper[4732]: I1010 08:25:52.662171 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:25:52 crc kubenswrapper[4732]: E1010 08:25:52.662416 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:25:53 crc kubenswrapper[4732]: I1010 08:25:53.094438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"52200daf-815f-40c7-8359-a7140dcd863f","Type":"ContainerStarted","Data":"13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07"} Oct 10 08:25:53 crc kubenswrapper[4732]: I1010 08:25:53.095023 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 10 08:25:53 crc kubenswrapper[4732]: I1010 08:25:53.118527 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.118496648 podStartE2EDuration="2.118496648s" podCreationTimestamp="2025-10-10 08:25:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:25:53.109932728 +0000 UTC m=+5680.179523999" watchObservedRunningTime="2025-10-10 08:25:53.118496648 +0000 UTC m=+5680.188087919" Oct 10 08:25:54 crc kubenswrapper[4732]: I1010 08:25:54.035234 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-klcgc"] Oct 10 08:25:54 crc kubenswrapper[4732]: I1010 08:25:54.051030 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-klcgc"] Oct 10 08:25:55 crc kubenswrapper[4732]: I1010 08:25:55.671725 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b02470-428a-42f3-8327-2994102a0c89" path="/var/lib/kubelet/pods/66b02470-428a-42f3-8327-2994102a0c89/volumes" Oct 10 08:26:01 crc kubenswrapper[4732]: I1010 08:26:01.507530 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.066592 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mzxtv"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.068107 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.073951 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.073999 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.082523 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mzxtv"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.156853 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-config-data\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.156967 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfnd\" (UniqueName: \"kubernetes.io/projected/6702bc05-24bf-45f4-96b2-994a19c2a40e-kube-api-access-scfnd\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.157021 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-scripts\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.157112 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.193159 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.194628 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.205146 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.216226 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.259084 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-scripts\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.259190 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.259246 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-config-data\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 
08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.259311 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scfnd\" (UniqueName: \"kubernetes.io/projected/6702bc05-24bf-45f4-96b2-994a19c2a40e-kube-api-access-scfnd\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.267716 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-config-data\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.268555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.271886 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.273086 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.273304 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-scripts\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.278061 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.285570 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfnd\" (UniqueName: \"kubernetes.io/projected/6702bc05-24bf-45f4-96b2-994a19c2a40e-kube-api-access-scfnd\") pod \"nova-cell0-cell-mapping-mzxtv\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") " pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.291225 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.311359 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.312974 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.316675 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.343764 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.360335 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-config-data\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.360373 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-config-data\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.360429 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.360451 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749xq\" (UniqueName: \"kubernetes.io/projected/2f93119c-52c0-49b7-bff9-1833aa4bb249-kube-api-access-749xq\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.360477 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f93119c-52c0-49b7-bff9-1833aa4bb249-logs\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.360514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxbw2\" (UniqueName: \"kubernetes.io/projected/64d8b8f1-2e34-4eb9-8568-93aac31b406b-kube-api-access-jxbw2\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.360548 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.402141 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.403788 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.407009 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.414388 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mzxtv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.422452 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462771 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-config-data\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462822 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-config-data\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462870 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxk4\" (UniqueName: \"kubernetes.io/projected/74749092-ceea-41e0-848c-ee1227f2bcb5-kube-api-access-8nxk4\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462906 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462925 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-749xq\" (UniqueName: \"kubernetes.io/projected/2f93119c-52c0-49b7-bff9-1833aa4bb249-kube-api-access-749xq\") pod \"nova-api-0\" (UID: 
\"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462952 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f93119c-52c0-49b7-bff9-1833aa4bb249-logs\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462969 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.462987 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.463019 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxbw2\" (UniqueName: \"kubernetes.io/projected/64d8b8f1-2e34-4eb9-8568-93aac31b406b-kube-api-access-jxbw2\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.463055 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 
08:26:02.467009 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f93119c-52c0-49b7-bff9-1833aa4bb249-logs\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.468449 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.468515 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-config-data\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.470280 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.471379 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-config-data\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.489683 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-749xq\" (UniqueName: \"kubernetes.io/projected/2f93119c-52c0-49b7-bff9-1833aa4bb249-kube-api-access-749xq\") pod \"nova-api-0\" (UID: 
\"2f93119c-52c0-49b7-bff9-1833aa4bb249\") " pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.507258 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxbw2\" (UniqueName: \"kubernetes.io/projected/64d8b8f1-2e34-4eb9-8568-93aac31b406b-kube-api-access-jxbw2\") pod \"nova-scheduler-0\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.520126 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.554363 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b995548b9-n79sv"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.555875 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.570054 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-config-data\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.570105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.570145 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2r94\" (UniqueName: 
\"kubernetes.io/projected/dd847629-30ff-4579-a536-0cfd77a3b888-kube-api-access-t2r94\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.570203 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd847629-30ff-4579-a536-0cfd77a3b888-logs\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.570240 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nxk4\" (UniqueName: \"kubernetes.io/projected/74749092-ceea-41e0-848c-ee1227f2bcb5-kube-api-access-8nxk4\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.570287 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.570305 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.571258 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b995548b9-n79sv"] Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.578590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.604964 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.647494 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxk4\" (UniqueName: \"kubernetes.io/projected/74749092-ceea-41e0-848c-ee1227f2bcb5-kube-api-access-8nxk4\") pod \"nova-cell1-novncproxy-0\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672177 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-config\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672240 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-config-data\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672271 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672294 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-sb\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2r94\" (UniqueName: \"kubernetes.io/projected/dd847629-30ff-4579-a536-0cfd77a3b888-kube-api-access-t2r94\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672334 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8r54\" (UniqueName: \"kubernetes.io/projected/0413febe-2fe9-4567-a937-4a24918cac93-kube-api-access-w8r54\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672363 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-nb\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672387 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-dns-svc\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672414 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd847629-30ff-4579-a536-0cfd77a3b888-logs\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.672853 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd847629-30ff-4579-a536-0cfd77a3b888-logs\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.678778 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-config-data\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.689371 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.697543 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.703582 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2r94\" (UniqueName: \"kubernetes.io/projected/dd847629-30ff-4579-a536-0cfd77a3b888-kube-api-access-t2r94\") pod \"nova-metadata-0\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.729600 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.747091 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.774099 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-config\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.774196 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-sb\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.774232 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8r54\" (UniqueName: \"kubernetes.io/projected/0413febe-2fe9-4567-a937-4a24918cac93-kube-api-access-w8r54\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 
08:26:02.774270 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-nb\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.774302 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-dns-svc\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.775062 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-config\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.775602 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-sb\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.775891 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-dns-svc\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.776977 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-nb\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.793497 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8r54\" (UniqueName: \"kubernetes.io/projected/0413febe-2fe9-4567-a937-4a24918cac93-kube-api-access-w8r54\") pod \"dnsmasq-dns-6b995548b9-n79sv\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:02 crc kubenswrapper[4732]: I1010 08:26:02.970836 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.157200 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:26:03 crc kubenswrapper[4732]: W1010 08:26:03.160152 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f93119c_52c0_49b7_bff9_1833aa4bb249.slice/crio-2a5d959d7bd27c9807d9e005c326fba693889ac4c0d491371fa78a5f347327d1 WatchSource:0}: Error finding container 2a5d959d7bd27c9807d9e005c326fba693889ac4c0d491371fa78a5f347327d1: Status 404 returned error can't find the container with id 2a5d959d7bd27c9807d9e005c326fba693889ac4c0d491371fa78a5f347327d1 Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.165090 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.193460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f93119c-52c0-49b7-bff9-1833aa4bb249","Type":"ContainerStarted","Data":"2a5d959d7bd27c9807d9e005c326fba693889ac4c0d491371fa78a5f347327d1"} Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 
08:26:03.261326 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mzxtv"] Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.273492 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xl7lk"] Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.275360 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.278311 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.278561 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.303139 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xl7lk"] Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.387549 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.394996 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-config-data\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.395032 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w7md\" (UniqueName: \"kubernetes.io/projected/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-kube-api-access-6w7md\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 
08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.395060 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.395258 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-scripts\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: W1010 08:26:03.403250 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64d8b8f1_2e34_4eb9_8568_93aac31b406b.slice/crio-b68f13edb955abc02470b7ccabf0504b60e0d217261eb5be6f0e171d5dc9544c WatchSource:0}: Error finding container b68f13edb955abc02470b7ccabf0504b60e0d217261eb5be6f0e171d5dc9544c: Status 404 returned error can't find the container with id b68f13edb955abc02470b7ccabf0504b60e0d217261eb5be6f0e171d5dc9544c Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.471066 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:03 crc kubenswrapper[4732]: W1010 08:26:03.483815 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd847629_30ff_4579_a536_0cfd77a3b888.slice/crio-8d3273b61f8ab85611f85ccbe475bbb9a6c09df2e6b2c6b4026c9bfe13277497 WatchSource:0}: Error finding container 8d3273b61f8ab85611f85ccbe475bbb9a6c09df2e6b2c6b4026c9bfe13277497: Status 404 returned error can't find the container with id 
8d3273b61f8ab85611f85ccbe475bbb9a6c09df2e6b2c6b4026c9bfe13277497 Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.497446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-config-data\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.497508 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w7md\" (UniqueName: \"kubernetes.io/projected/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-kube-api-access-6w7md\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.497538 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.497603 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-scripts\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.503121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-scripts\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " 
pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.503229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.504311 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-config-data\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.526612 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w7md\" (UniqueName: \"kubernetes.io/projected/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-kube-api-access-6w7md\") pod \"nova-cell1-conductor-db-sync-xl7lk\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.570335 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:03 crc kubenswrapper[4732]: W1010 08:26:03.582318 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74749092_ceea_41e0_848c_ee1227f2bcb5.slice/crio-27a52052540b3c0f12aaff0219be60abd327b5fd5ef3ebbf9bb010ddca17f5c1 WatchSource:0}: Error finding container 27a52052540b3c0f12aaff0219be60abd327b5fd5ef3ebbf9bb010ddca17f5c1: Status 404 returned error can't find the container with id 27a52052540b3c0f12aaff0219be60abd327b5fd5ef3ebbf9bb010ddca17f5c1 Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 
08:26:03.636194 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b995548b9-n79sv"] Oct 10 08:26:03 crc kubenswrapper[4732]: W1010 08:26:03.643323 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0413febe_2fe9_4567_a937_4a24918cac93.slice/crio-048fb19035e4175da793f9e482c97ca630fd2834dbf9d31a34375f69b800d8b3 WatchSource:0}: Error finding container 048fb19035e4175da793f9e482c97ca630fd2834dbf9d31a34375f69b800d8b3: Status 404 returned error can't find the container with id 048fb19035e4175da793f9e482c97ca630fd2834dbf9d31a34375f69b800d8b3 Oct 10 08:26:03 crc kubenswrapper[4732]: I1010 08:26:03.805323 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.127762 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xl7lk"] Oct 10 08:26:04 crc kubenswrapper[4732]: W1010 08:26:04.142016 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50c57446_a9d6_4657_991f_7c4bd7cf0aa8.slice/crio-6f06f4d61f54ff3144fb81a74f2dab1d0cf7a65104b78d736845743f1dbfbf2b WatchSource:0}: Error finding container 6f06f4d61f54ff3144fb81a74f2dab1d0cf7a65104b78d736845743f1dbfbf2b: Status 404 returned error can't find the container with id 6f06f4d61f54ff3144fb81a74f2dab1d0cf7a65104b78d736845743f1dbfbf2b Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.205295 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74749092-ceea-41e0-848c-ee1227f2bcb5","Type":"ContainerStarted","Data":"27a52052540b3c0f12aaff0219be60abd327b5fd5ef3ebbf9bb010ddca17f5c1"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.207662 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-mzxtv" event={"ID":"6702bc05-24bf-45f4-96b2-994a19c2a40e","Type":"ContainerStarted","Data":"b8dd24412467df82499f51adb57d84b91a04b84116b652042d87019d87b25cd6"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.207744 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mzxtv" event={"ID":"6702bc05-24bf-45f4-96b2-994a19c2a40e","Type":"ContainerStarted","Data":"f495e695f5fdccf33fea2fffa732d34e2869212e41a3e65011d9eff7004c4182"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.210172 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd847629-30ff-4579-a536-0cfd77a3b888","Type":"ContainerStarted","Data":"8d3273b61f8ab85611f85ccbe475bbb9a6c09df2e6b2c6b4026c9bfe13277497"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.212266 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" event={"ID":"50c57446-a9d6-4657-991f-7c4bd7cf0aa8","Type":"ContainerStarted","Data":"6f06f4d61f54ff3144fb81a74f2dab1d0cf7a65104b78d736845743f1dbfbf2b"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.214382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64d8b8f1-2e34-4eb9-8568-93aac31b406b","Type":"ContainerStarted","Data":"b68f13edb955abc02470b7ccabf0504b60e0d217261eb5be6f0e171d5dc9544c"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.225964 4732 generic.go:334] "Generic (PLEG): container finished" podID="0413febe-2fe9-4567-a937-4a24918cac93" containerID="e73f07b03688f6e641c4a4b31296aa8f8ea841289cfddb1cf02292012c0d0749" exitCode=0 Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.226187 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" 
event={"ID":"0413febe-2fe9-4567-a937-4a24918cac93","Type":"ContainerDied","Data":"e73f07b03688f6e641c4a4b31296aa8f8ea841289cfddb1cf02292012c0d0749"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.226217 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" event={"ID":"0413febe-2fe9-4567-a937-4a24918cac93","Type":"ContainerStarted","Data":"048fb19035e4175da793f9e482c97ca630fd2834dbf9d31a34375f69b800d8b3"} Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.237094 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mzxtv" podStartSLOduration=2.237073029 podStartE2EDuration="2.237073029s" podCreationTimestamp="2025-10-10 08:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:04.227170794 +0000 UTC m=+5691.296762035" watchObservedRunningTime="2025-10-10 08:26:04.237073029 +0000 UTC m=+5691.306664270" Oct 10 08:26:04 crc kubenswrapper[4732]: I1010 08:26:04.660686 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:26:04 crc kubenswrapper[4732]: E1010 08:26:04.661234 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:26:05 crc kubenswrapper[4732]: I1010 08:26:05.237699 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" 
event={"ID":"50c57446-a9d6-4657-991f-7c4bd7cf0aa8","Type":"ContainerStarted","Data":"1dedbc4f9d946fb0f20248d0d35cbe013c515f1d2d08a68b8f07bde1c1d53841"} Oct 10 08:26:05 crc kubenswrapper[4732]: I1010 08:26:05.240131 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" event={"ID":"0413febe-2fe9-4567-a937-4a24918cac93","Type":"ContainerStarted","Data":"75b4a4eb6e9af1da58b504db37375eda1c9467b6c6bd15631730c57bb1029754"} Oct 10 08:26:05 crc kubenswrapper[4732]: I1010 08:26:05.266471 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" podStartSLOduration=2.266451723 podStartE2EDuration="2.266451723s" podCreationTimestamp="2025-10-10 08:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:05.254734719 +0000 UTC m=+5692.324325970" watchObservedRunningTime="2025-10-10 08:26:05.266451723 +0000 UTC m=+5692.336042964" Oct 10 08:26:05 crc kubenswrapper[4732]: I1010 08:26:05.278949 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" podStartSLOduration=3.278930877 podStartE2EDuration="3.278930877s" podCreationTimestamp="2025-10-10 08:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:05.278098824 +0000 UTC m=+5692.347690075" watchObservedRunningTime="2025-10-10 08:26:05.278930877 +0000 UTC m=+5692.348522118" Oct 10 08:26:06 crc kubenswrapper[4732]: I1010 08:26:06.247923 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:26:06 crc kubenswrapper[4732]: I1010 08:26:06.757706 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:06 crc kubenswrapper[4732]: I1010 
08:26:06.778762 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.259265 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f93119c-52c0-49b7-bff9-1833aa4bb249","Type":"ContainerStarted","Data":"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"} Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.260232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f93119c-52c0-49b7-bff9-1833aa4bb249","Type":"ContainerStarted","Data":"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"} Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.267173 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="74749092-ceea-41e0-848c-ee1227f2bcb5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c8a825e728af6abc8dd7867631f639827e3c69cab4bb505fcd946900f82ab7ff" gracePeriod=30 Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.267314 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74749092-ceea-41e0-848c-ee1227f2bcb5","Type":"ContainerStarted","Data":"c8a825e728af6abc8dd7867631f639827e3c69cab4bb505fcd946900f82ab7ff"} Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.274335 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd847629-30ff-4579-a536-0cfd77a3b888","Type":"ContainerStarted","Data":"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed"} Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.274387 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd847629-30ff-4579-a536-0cfd77a3b888","Type":"ContainerStarted","Data":"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6"} Oct 10 
08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.274525 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-log" containerID="cri-o://2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6" gracePeriod=30 Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.274613 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-metadata" containerID="cri-o://131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed" gracePeriod=30 Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.282108 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64d8b8f1-2e34-4eb9-8568-93aac31b406b","Type":"ContainerStarted","Data":"7065c2b7322bb18d5ee8470c3d83a0a9d8f1a30a87b10444edefbe180e1bc723"} Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.293909 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.974850247 podStartE2EDuration="5.293889683s" podCreationTimestamp="2025-10-10 08:26:02 +0000 UTC" firstStartedPulling="2025-10-10 08:26:03.164893278 +0000 UTC m=+5690.234484519" lastFinishedPulling="2025-10-10 08:26:06.483932714 +0000 UTC m=+5693.553523955" observedRunningTime="2025-10-10 08:26:07.282444287 +0000 UTC m=+5694.352035548" watchObservedRunningTime="2025-10-10 08:26:07.293889683 +0000 UTC m=+5694.363480924" Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.314217 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.331492347 podStartE2EDuration="5.314193987s" podCreationTimestamp="2025-10-10 08:26:02 +0000 UTC" firstStartedPulling="2025-10-10 08:26:03.487130237 +0000 UTC m=+5690.556721478" 
lastFinishedPulling="2025-10-10 08:26:06.469831867 +0000 UTC m=+5693.539423118" observedRunningTime="2025-10-10 08:26:07.30196696 +0000 UTC m=+5694.371558201" watchObservedRunningTime="2025-10-10 08:26:07.314193987 +0000 UTC m=+5694.383785238" Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.331261 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.254039593 podStartE2EDuration="5.331243104s" podCreationTimestamp="2025-10-10 08:26:02 +0000 UTC" firstStartedPulling="2025-10-10 08:26:03.406336213 +0000 UTC m=+5690.475927454" lastFinishedPulling="2025-10-10 08:26:06.483539724 +0000 UTC m=+5693.553130965" observedRunningTime="2025-10-10 08:26:07.316631512 +0000 UTC m=+5694.386222753" watchObservedRunningTime="2025-10-10 08:26:07.331243104 +0000 UTC m=+5694.400834335" Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.341388 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.4410312100000002 podStartE2EDuration="5.341370685s" podCreationTimestamp="2025-10-10 08:26:02 +0000 UTC" firstStartedPulling="2025-10-10 08:26:03.591149122 +0000 UTC m=+5690.660740363" lastFinishedPulling="2025-10-10 08:26:06.491488607 +0000 UTC m=+5693.561079838" observedRunningTime="2025-10-10 08:26:07.334182852 +0000 UTC m=+5694.403774103" watchObservedRunningTime="2025-10-10 08:26:07.341370685 +0000 UTC m=+5694.410961926" Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.698416 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.731207 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.747750 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 
08:26:07 crc kubenswrapper[4732]: I1010 08:26:07.747795 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.042385 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lzxlw"] Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.050730 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lzxlw"] Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.207596 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.290263 4732 generic.go:334] "Generic (PLEG): container finished" podID="dd847629-30ff-4579-a536-0cfd77a3b888" containerID="131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed" exitCode=0 Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.290327 4732 generic.go:334] "Generic (PLEG): container finished" podID="dd847629-30ff-4579-a536-0cfd77a3b888" containerID="2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6" exitCode=143 Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.290331 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.290376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd847629-30ff-4579-a536-0cfd77a3b888","Type":"ContainerDied","Data":"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed"} Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.290425 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd847629-30ff-4579-a536-0cfd77a3b888","Type":"ContainerDied","Data":"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6"} Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.290451 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dd847629-30ff-4579-a536-0cfd77a3b888","Type":"ContainerDied","Data":"8d3273b61f8ab85611f85ccbe475bbb9a6c09df2e6b2c6b4026c9bfe13277497"} Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.290474 4732 scope.go:117] "RemoveContainer" containerID="131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.292574 4732 generic.go:334] "Generic (PLEG): container finished" podID="50c57446-a9d6-4657-991f-7c4bd7cf0aa8" containerID="1dedbc4f9d946fb0f20248d0d35cbe013c515f1d2d08a68b8f07bde1c1d53841" exitCode=0 Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.292626 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" event={"ID":"50c57446-a9d6-4657-991f-7c4bd7cf0aa8","Type":"ContainerDied","Data":"1dedbc4f9d946fb0f20248d0d35cbe013c515f1d2d08a68b8f07bde1c1d53841"} Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.319005 4732 scope.go:117] "RemoveContainer" containerID="2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.327370 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-t2r94\" (UniqueName: \"kubernetes.io/projected/dd847629-30ff-4579-a536-0cfd77a3b888-kube-api-access-t2r94\") pod \"dd847629-30ff-4579-a536-0cfd77a3b888\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.327440 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd847629-30ff-4579-a536-0cfd77a3b888-logs\") pod \"dd847629-30ff-4579-a536-0cfd77a3b888\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.327475 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-config-data\") pod \"dd847629-30ff-4579-a536-0cfd77a3b888\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.327627 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-combined-ca-bundle\") pod \"dd847629-30ff-4579-a536-0cfd77a3b888\" (UID: \"dd847629-30ff-4579-a536-0cfd77a3b888\") " Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.328672 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd847629-30ff-4579-a536-0cfd77a3b888-logs" (OuterVolumeSpecName: "logs") pod "dd847629-30ff-4579-a536-0cfd77a3b888" (UID: "dd847629-30ff-4579-a536-0cfd77a3b888"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.338751 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd847629-30ff-4579-a536-0cfd77a3b888-kube-api-access-t2r94" (OuterVolumeSpecName: "kube-api-access-t2r94") pod "dd847629-30ff-4579-a536-0cfd77a3b888" (UID: "dd847629-30ff-4579-a536-0cfd77a3b888"). InnerVolumeSpecName "kube-api-access-t2r94". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.357481 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd847629-30ff-4579-a536-0cfd77a3b888" (UID: "dd847629-30ff-4579-a536-0cfd77a3b888"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.360183 4732 scope.go:117] "RemoveContainer" containerID="131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed" Oct 10 08:26:08 crc kubenswrapper[4732]: E1010 08:26:08.360639 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed\": container with ID starting with 131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed not found: ID does not exist" containerID="131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.360672 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed"} err="failed to get container status \"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed\": rpc error: code = NotFound desc = could not find 
container \"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed\": container with ID starting with 131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed not found: ID does not exist" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.360756 4732 scope.go:117] "RemoveContainer" containerID="2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6" Oct 10 08:26:08 crc kubenswrapper[4732]: E1010 08:26:08.361288 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6\": container with ID starting with 2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6 not found: ID does not exist" containerID="2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.361305 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6"} err="failed to get container status \"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6\": rpc error: code = NotFound desc = could not find container \"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6\": container with ID starting with 2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6 not found: ID does not exist" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.361319 4732 scope.go:117] "RemoveContainer" containerID="131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.361801 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed"} err="failed to get container status \"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed\": rpc error: code = NotFound desc = could 
not find container \"131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed\": container with ID starting with 131d441aa1a2e1d2fc391fc3e5a37f5164c732b9527762d0e7f95a80c8ea73ed not found: ID does not exist" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.361825 4732 scope.go:117] "RemoveContainer" containerID="2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.362076 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6"} err="failed to get container status \"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6\": rpc error: code = NotFound desc = could not find container \"2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6\": container with ID starting with 2160090ce07350a4645f04bf64055b0b6cea02fd15d6614bbfe754dce1a544c6 not found: ID does not exist" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.373270 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-config-data" (OuterVolumeSpecName: "config-data") pod "dd847629-30ff-4579-a536-0cfd77a3b888" (UID: "dd847629-30ff-4579-a536-0cfd77a3b888"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.430480 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.430515 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2r94\" (UniqueName: \"kubernetes.io/projected/dd847629-30ff-4579-a536-0cfd77a3b888-kube-api-access-t2r94\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.430527 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd847629-30ff-4579-a536-0cfd77a3b888-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.430538 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd847629-30ff-4579-a536-0cfd77a3b888-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.635105 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.641544 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.650937 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:08 crc kubenswrapper[4732]: E1010 08:26:08.651382 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-log" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.651408 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-log" Oct 10 08:26:08 crc 
kubenswrapper[4732]: E1010 08:26:08.651417 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-metadata" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.651424 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-metadata" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.651588 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-metadata" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.651621 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" containerName="nova-metadata-log" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.652568 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.654957 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.655226 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.702240 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.736723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-logs\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.736939 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-56bnn\" (UniqueName: \"kubernetes.io/projected/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-kube-api-access-56bnn\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.736974 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.736994 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.737050 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-config-data\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.839198 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-config-data\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.839308 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-logs\") pod 
\"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.839417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56bnn\" (UniqueName: \"kubernetes.io/projected/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-kube-api-access-56bnn\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.839442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.839459 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.841245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-logs\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.845039 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.845345 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-config-data\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.845632 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.859956 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56bnn\" (UniqueName: \"kubernetes.io/projected/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-kube-api-access-56bnn\") pod \"nova-metadata-0\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") " pod="openstack/nova-metadata-0" Oct 10 08:26:08 crc kubenswrapper[4732]: I1010 08:26:08.970392 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.323059 4732 generic.go:334] "Generic (PLEG): container finished" podID="6702bc05-24bf-45f4-96b2-994a19c2a40e" containerID="b8dd24412467df82499f51adb57d84b91a04b84116b652042d87019d87b25cd6" exitCode=0 Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.324763 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mzxtv" event={"ID":"6702bc05-24bf-45f4-96b2-994a19c2a40e","Type":"ContainerDied","Data":"b8dd24412467df82499f51adb57d84b91a04b84116b652042d87019d87b25cd6"} Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.465182 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:09 crc kubenswrapper[4732]: W1010 08:26:09.478517 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacd0f4c7_dfa1_4f13_b0ba_0e0ed15964f6.slice/crio-70b41591787edbef24dc7f92b24fbb80d53f8e5c86459864dc1f3ff0100e83cb WatchSource:0}: Error finding container 70b41591787edbef24dc7f92b24fbb80d53f8e5c86459864dc1f3ff0100e83cb: Status 404 returned error can't find the container with id 70b41591787edbef24dc7f92b24fbb80d53f8e5c86459864dc1f3ff0100e83cb Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.644429 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.677013 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8872e039-09c6-47bc-8ce5-1c512f861997" path="/var/lib/kubelet/pods/8872e039-09c6-47bc-8ce5-1c512f861997/volumes" Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.677636 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd847629-30ff-4579-a536-0cfd77a3b888" path="/var/lib/kubelet/pods/dd847629-30ff-4579-a536-0cfd77a3b888/volumes" Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.762355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w7md\" (UniqueName: \"kubernetes.io/projected/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-kube-api-access-6w7md\") pod \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.762488 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-combined-ca-bundle\") pod \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.762570 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-scripts\") pod \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.762594 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-config-data\") pod \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\" (UID: \"50c57446-a9d6-4657-991f-7c4bd7cf0aa8\") " Oct 10 
08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.765654 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-scripts" (OuterVolumeSpecName: "scripts") pod "50c57446-a9d6-4657-991f-7c4bd7cf0aa8" (UID: "50c57446-a9d6-4657-991f-7c4bd7cf0aa8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.766583 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-kube-api-access-6w7md" (OuterVolumeSpecName: "kube-api-access-6w7md") pod "50c57446-a9d6-4657-991f-7c4bd7cf0aa8" (UID: "50c57446-a9d6-4657-991f-7c4bd7cf0aa8"). InnerVolumeSpecName "kube-api-access-6w7md". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.786632 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50c57446-a9d6-4657-991f-7c4bd7cf0aa8" (UID: "50c57446-a9d6-4657-991f-7c4bd7cf0aa8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.797053 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-config-data" (OuterVolumeSpecName: "config-data") pod "50c57446-a9d6-4657-991f-7c4bd7cf0aa8" (UID: "50c57446-a9d6-4657-991f-7c4bd7cf0aa8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.865016 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.865047 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.865059 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w7md\" (UniqueName: \"kubernetes.io/projected/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-kube-api-access-6w7md\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:09 crc kubenswrapper[4732]: I1010 08:26:09.865070 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c57446-a9d6-4657-991f-7c4bd7cf0aa8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.343383 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6","Type":"ContainerStarted","Data":"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c"}
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.343897 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6","Type":"ContainerStarted","Data":"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e"}
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.343912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6","Type":"ContainerStarted","Data":"70b41591787edbef24dc7f92b24fbb80d53f8e5c86459864dc1f3ff0100e83cb"}
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.348385 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xl7lk" event={"ID":"50c57446-a9d6-4657-991f-7c4bd7cf0aa8","Type":"ContainerDied","Data":"6f06f4d61f54ff3144fb81a74f2dab1d0cf7a65104b78d736845743f1dbfbf2b"}
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.348437 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f06f4d61f54ff3144fb81a74f2dab1d0cf7a65104b78d736845743f1dbfbf2b"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.350757 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xl7lk"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.377365 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.377347502 podStartE2EDuration="2.377347502s" podCreationTimestamp="2025-10-10 08:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:10.37355357 +0000 UTC m=+5697.443144851" watchObservedRunningTime="2025-10-10 08:26:10.377347502 +0000 UTC m=+5697.446938763"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.423345 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 10 08:26:10 crc kubenswrapper[4732]: E1010 08:26:10.423792 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c57446-a9d6-4657-991f-7c4bd7cf0aa8" containerName="nova-cell1-conductor-db-sync"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.423810 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c57446-a9d6-4657-991f-7c4bd7cf0aa8" containerName="nova-cell1-conductor-db-sync"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.423987 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c57446-a9d6-4657-991f-7c4bd7cf0aa8" containerName="nova-cell1-conductor-db-sync"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.424584 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.428283 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.431146 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.578137 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.578227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/c31709ff-c729-4dbc-a23f-11f545334204-kube-api-access-kfg5c\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.578300 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.680292 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.680512 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/c31709ff-c729-4dbc-a23f-11f545334204-kube-api-access-kfg5c\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.680615 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.686153 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.686375 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.699011 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/c31709ff-c729-4dbc-a23f-11f545334204-kube-api-access-kfg5c\") pod \"nova-cell1-conductor-0\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.753147 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.756255 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mzxtv"
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.882998 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-scripts\") pod \"6702bc05-24bf-45f4-96b2-994a19c2a40e\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") "
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.883253 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-combined-ca-bundle\") pod \"6702bc05-24bf-45f4-96b2-994a19c2a40e\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") "
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.884083 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-config-data\") pod \"6702bc05-24bf-45f4-96b2-994a19c2a40e\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") "
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.884324 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scfnd\" (UniqueName: \"kubernetes.io/projected/6702bc05-24bf-45f4-96b2-994a19c2a40e-kube-api-access-scfnd\") pod \"6702bc05-24bf-45f4-96b2-994a19c2a40e\" (UID: \"6702bc05-24bf-45f4-96b2-994a19c2a40e\") "
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.886093 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-scripts" (OuterVolumeSpecName: "scripts") pod "6702bc05-24bf-45f4-96b2-994a19c2a40e" (UID: "6702bc05-24bf-45f4-96b2-994a19c2a40e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.888370 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6702bc05-24bf-45f4-96b2-994a19c2a40e-kube-api-access-scfnd" (OuterVolumeSpecName: "kube-api-access-scfnd") pod "6702bc05-24bf-45f4-96b2-994a19c2a40e" (UID: "6702bc05-24bf-45f4-96b2-994a19c2a40e"). InnerVolumeSpecName "kube-api-access-scfnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.916035 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6702bc05-24bf-45f4-96b2-994a19c2a40e" (UID: "6702bc05-24bf-45f4-96b2-994a19c2a40e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.916837 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-config-data" (OuterVolumeSpecName: "config-data") pod "6702bc05-24bf-45f4-96b2-994a19c2a40e" (UID: "6702bc05-24bf-45f4-96b2-994a19c2a40e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.986441 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-scripts\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.986483 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.986497 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6702bc05-24bf-45f4-96b2-994a19c2a40e-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:10 crc kubenswrapper[4732]: I1010 08:26:10.986506 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scfnd\" (UniqueName: \"kubernetes.io/projected/6702bc05-24bf-45f4-96b2-994a19c2a40e-kube-api-access-scfnd\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.242502 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 10 08:26:11 crc kubenswrapper[4732]: W1010 08:26:11.243072 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc31709ff_c729_4dbc_a23f_11f545334204.slice/crio-7190b578bf25319e3f7632bc8f2aad0d4600ea5f1087e9ff62b0fe9cf203b6a3 WatchSource:0}: Error finding container 7190b578bf25319e3f7632bc8f2aad0d4600ea5f1087e9ff62b0fe9cf203b6a3: Status 404 returned error can't find the container with id 7190b578bf25319e3f7632bc8f2aad0d4600ea5f1087e9ff62b0fe9cf203b6a3
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.378082 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mzxtv"
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.378242 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mzxtv" event={"ID":"6702bc05-24bf-45f4-96b2-994a19c2a40e","Type":"ContainerDied","Data":"f495e695f5fdccf33fea2fffa732d34e2869212e41a3e65011d9eff7004c4182"}
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.378281 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f495e695f5fdccf33fea2fffa732d34e2869212e41a3e65011d9eff7004c4182"
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.380260 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c31709ff-c729-4dbc-a23f-11f545334204","Type":"ContainerStarted","Data":"7190b578bf25319e3f7632bc8f2aad0d4600ea5f1087e9ff62b0fe9cf203b6a3"}
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.542789 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.543056 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="64d8b8f1-2e34-4eb9-8568-93aac31b406b" containerName="nova-scheduler-scheduler" containerID="cri-o://7065c2b7322bb18d5ee8470c3d83a0a9d8f1a30a87b10444edefbe180e1bc723" gracePeriod=30
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.562737 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.563380 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-log" containerID="cri-o://250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb" gracePeriod=30
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.563617 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-api" containerID="cri-o://a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea" gracePeriod=30
Oct 10 08:26:11 crc kubenswrapper[4732]: I1010 08:26:11.579944 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.284137 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.403223 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c31709ff-c729-4dbc-a23f-11f545334204","Type":"ContainerStarted","Data":"390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68"}
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.403942 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.408003 4732 generic.go:334] "Generic (PLEG): container finished" podID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerID="a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea" exitCode=0
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.408044 4732 generic.go:334] "Generic (PLEG): container finished" podID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerID="250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb" exitCode=143
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.408288 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-log" containerID="cri-o://1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e" gracePeriod=30
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.409170 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-metadata" containerID="cri-o://b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c" gracePeriod=30
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.409222 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.409492 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f93119c-52c0-49b7-bff9-1833aa4bb249","Type":"ContainerDied","Data":"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"}
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.409529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f93119c-52c0-49b7-bff9-1833aa4bb249","Type":"ContainerDied","Data":"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"}
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.409552 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2f93119c-52c0-49b7-bff9-1833aa4bb249","Type":"ContainerDied","Data":"2a5d959d7bd27c9807d9e005c326fba693889ac4c0d491371fa78a5f347327d1"}
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.409569 4732 scope.go:117] "RemoveContainer" containerID="a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.418211 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f93119c-52c0-49b7-bff9-1833aa4bb249-logs\") pod \"2f93119c-52c0-49b7-bff9-1833aa4bb249\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") "
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.418284 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-749xq\" (UniqueName: \"kubernetes.io/projected/2f93119c-52c0-49b7-bff9-1833aa4bb249-kube-api-access-749xq\") pod \"2f93119c-52c0-49b7-bff9-1833aa4bb249\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") "
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.418324 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-config-data\") pod \"2f93119c-52c0-49b7-bff9-1833aa4bb249\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") "
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.418436 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-combined-ca-bundle\") pod \"2f93119c-52c0-49b7-bff9-1833aa4bb249\" (UID: \"2f93119c-52c0-49b7-bff9-1833aa4bb249\") "
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.418870 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f93119c-52c0-49b7-bff9-1833aa4bb249-logs" (OuterVolumeSpecName: "logs") pod "2f93119c-52c0-49b7-bff9-1833aa4bb249" (UID: "2f93119c-52c0-49b7-bff9-1833aa4bb249"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.423045 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f93119c-52c0-49b7-bff9-1833aa4bb249-logs\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.429328 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f93119c-52c0-49b7-bff9-1833aa4bb249-kube-api-access-749xq" (OuterVolumeSpecName: "kube-api-access-749xq") pod "2f93119c-52c0-49b7-bff9-1833aa4bb249" (UID: "2f93119c-52c0-49b7-bff9-1833aa4bb249"). InnerVolumeSpecName "kube-api-access-749xq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.443500 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.443475628 podStartE2EDuration="2.443475628s" podCreationTimestamp="2025-10-10 08:26:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:12.430519151 +0000 UTC m=+5699.500110392" watchObservedRunningTime="2025-10-10 08:26:12.443475628 +0000 UTC m=+5699.513066879"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.446564 4732 scope.go:117] "RemoveContainer" containerID="250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.456124 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f93119c-52c0-49b7-bff9-1833aa4bb249" (UID: "2f93119c-52c0-49b7-bff9-1833aa4bb249"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.464721 4732 scope.go:117] "RemoveContainer" containerID="a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"
Oct 10 08:26:12 crc kubenswrapper[4732]: E1010 08:26:12.465216 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea\": container with ID starting with a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea not found: ID does not exist" containerID="a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.465313 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"} err="failed to get container status \"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea\": rpc error: code = NotFound desc = could not find container \"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea\": container with ID starting with a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea not found: ID does not exist"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.465455 4732 scope.go:117] "RemoveContainer" containerID="250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.465721 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-config-data" (OuterVolumeSpecName: "config-data") pod "2f93119c-52c0-49b7-bff9-1833aa4bb249" (UID: "2f93119c-52c0-49b7-bff9-1833aa4bb249"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:26:12 crc kubenswrapper[4732]: E1010 08:26:12.465859 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb\": container with ID starting with 250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb not found: ID does not exist" containerID="250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.465900 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"} err="failed to get container status \"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb\": rpc error: code = NotFound desc = could not find container \"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb\": container with ID starting with 250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb not found: ID does not exist"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.465931 4732 scope.go:117] "RemoveContainer" containerID="a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.466257 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea"} err="failed to get container status \"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea\": rpc error: code = NotFound desc = could not find container \"a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea\": container with ID starting with a31ddbc8dd94e9085635692a5d9cc9416f7938844bc6e3b5e8b7a218f5f6d7ea not found: ID does not exist"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.466304 4732 scope.go:117] "RemoveContainer" containerID="250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.466574 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb"} err="failed to get container status \"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb\": rpc error: code = NotFound desc = could not find container \"250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb\": container with ID starting with 250617df241fdadce18b827ae7d8e1b1ccc6196aa601d9324390ba937eb3e4eb not found: ID does not exist"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.525069 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-749xq\" (UniqueName: \"kubernetes.io/projected/2f93119c-52c0-49b7-bff9-1833aa4bb249-kube-api-access-749xq\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.525107 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-config-data\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.525123 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f93119c-52c0-49b7-bff9-1833aa4bb249-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.769166 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.793807 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.810433 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 10 08:26:12 crc kubenswrapper[4732]: E1010 08:26:12.811117 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6702bc05-24bf-45f4-96b2-994a19c2a40e" containerName="nova-manage"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.811153 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6702bc05-24bf-45f4-96b2-994a19c2a40e" containerName="nova-manage"
Oct 10 08:26:12 crc kubenswrapper[4732]: E1010 08:26:12.811189 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-log"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.811201 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-log"
Oct 10 08:26:12 crc kubenswrapper[4732]: E1010 08:26:12.811232 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-api"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.811244 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-api"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.811552 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6702bc05-24bf-45f4-96b2-994a19c2a40e" containerName="nova-manage"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.811598 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-log"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.811628 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" containerName="nova-api-api"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.813792 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.816112 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.831179 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.933003 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.933088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-config-data\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.933193 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg99h\" (UniqueName: \"kubernetes.io/projected/aade023c-02ab-4532-a658-c29cba47a7ec-kube-api-access-xg99h\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.933224 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aade023c-02ab-4532-a658-c29cba47a7ec-logs\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.971845 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b995548b9-n79sv"
Oct 10 08:26:12 crc kubenswrapper[4732]: I1010 08:26:12.994749 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.035774 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-config-data\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.036328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg99h\" (UniqueName: \"kubernetes.io/projected/aade023c-02ab-4532-a658-c29cba47a7ec-kube-api-access-xg99h\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.036418 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aade023c-02ab-4532-a658-c29cba47a7ec-logs\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.036652 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.036789 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aade023c-02ab-4532-a658-c29cba47a7ec-logs\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.040752 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-config-data\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.040813 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b45676f95-p75gt"]
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.041044 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b45676f95-p75gt" podUID="651a24c2-d598-4be7-82e7-676ae6360537" containerName="dnsmasq-dns" containerID="cri-o://5d40840329fc7c583e9aad24c61f27064823d79014a39fe4313e9dc62d092abf" gracePeriod=10
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.041604 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.075287 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg99h\" (UniqueName: \"kubernetes.io/projected/aade023c-02ab-4532-a658-c29cba47a7ec-kube-api-access-xg99h\") pod \"nova-api-0\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.130386 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.138526 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-combined-ca-bundle\") pod \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") "
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.138617 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-config-data\") pod \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") "
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.138747 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56bnn\" (UniqueName: \"kubernetes.io/projected/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-kube-api-access-56bnn\") pod \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") "
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.138802 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-nova-metadata-tls-certs\") pod \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") "
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.138892 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-logs\") pod \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\" (UID: \"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6\") "
Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.139734 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-logs" (OuterVolumeSpecName: "logs") pod "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" (UID: "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.150161 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-kube-api-access-56bnn" (OuterVolumeSpecName: "kube-api-access-56bnn") pod "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" (UID: "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6"). InnerVolumeSpecName "kube-api-access-56bnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.171977 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" (UID: "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.172035 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-config-data" (OuterVolumeSpecName: "config-data") pod "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" (UID: "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.199280 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" (UID: "acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.242433 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56bnn\" (UniqueName: \"kubernetes.io/projected/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-kube-api-access-56bnn\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.242465 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.242475 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.242486 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.242495 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.426533 4732 generic.go:334] "Generic (PLEG): container finished" podID="651a24c2-d598-4be7-82e7-676ae6360537" containerID="5d40840329fc7c583e9aad24c61f27064823d79014a39fe4313e9dc62d092abf" exitCode=0 Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.426585 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b45676f95-p75gt" 
event={"ID":"651a24c2-d598-4be7-82e7-676ae6360537","Type":"ContainerDied","Data":"5d40840329fc7c583e9aad24c61f27064823d79014a39fe4313e9dc62d092abf"} Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.428069 4732 generic.go:334] "Generic (PLEG): container finished" podID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerID="b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c" exitCode=0 Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.428084 4732 generic.go:334] "Generic (PLEG): container finished" podID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerID="1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e" exitCode=143 Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.428946 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6","Type":"ContainerDied","Data":"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c"} Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.429015 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6","Type":"ContainerDied","Data":"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e"} Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.429030 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6","Type":"ContainerDied","Data":"70b41591787edbef24dc7f92b24fbb80d53f8e5c86459864dc1f3ff0100e83cb"} Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.429023 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.429063 4732 scope.go:117] "RemoveContainer" containerID="b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.461182 4732 scope.go:117] "RemoveContainer" containerID="1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.479550 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.491484 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.497207 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.526878 4732 scope.go:117] "RemoveContainer" containerID="b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c" Oct 10 08:26:13 crc kubenswrapper[4732]: E1010 08:26:13.534169 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c\": container with ID starting with b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c not found: ID does not exist" containerID="b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.534210 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c"} err="failed to get container status \"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c\": rpc error: code = NotFound desc = could not find container 
\"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c\": container with ID starting with b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c not found: ID does not exist" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.534235 4732 scope.go:117] "RemoveContainer" containerID="1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e" Oct 10 08:26:13 crc kubenswrapper[4732]: E1010 08:26:13.534489 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e\": container with ID starting with 1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e not found: ID does not exist" containerID="1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.534505 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e"} err="failed to get container status \"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e\": rpc error: code = NotFound desc = could not find container \"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e\": container with ID starting with 1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e not found: ID does not exist" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.534530 4732 scope.go:117] "RemoveContainer" containerID="b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.534719 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c"} err="failed to get container status \"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c\": rpc error: code = NotFound desc = could not find 
container \"b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c\": container with ID starting with b8067d8b8645d19a6d6e7fa1b0cd601901862bccdf7ea634d6b5ccef2f6f9e7c not found: ID does not exist" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.534734 4732 scope.go:117] "RemoveContainer" containerID="1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.534900 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e"} err="failed to get container status \"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e\": rpc error: code = NotFound desc = could not find container \"1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e\": container with ID starting with 1103ce376df8857936f3e2a21df0466b2ed9191b55f0a125ff9ddf667e8ff46e not found: ID does not exist" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.537872 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:13 crc kubenswrapper[4732]: E1010 08:26:13.538380 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-metadata" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.538398 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-metadata" Oct 10 08:26:13 crc kubenswrapper[4732]: E1010 08:26:13.538415 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-log" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.538424 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-log" Oct 10 08:26:13 crc kubenswrapper[4732]: E1010 
08:26:13.538451 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651a24c2-d598-4be7-82e7-676ae6360537" containerName="init" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.538460 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="651a24c2-d598-4be7-82e7-676ae6360537" containerName="init" Oct 10 08:26:13 crc kubenswrapper[4732]: E1010 08:26:13.538490 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651a24c2-d598-4be7-82e7-676ae6360537" containerName="dnsmasq-dns" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.538498 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="651a24c2-d598-4be7-82e7-676ae6360537" containerName="dnsmasq-dns" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.538753 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-metadata" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.538784 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" containerName="nova-metadata-log" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.538808 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="651a24c2-d598-4be7-82e7-676ae6360537" containerName="dnsmasq-dns" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.539945 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.541981 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.543406 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.544892 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.650345 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf2j5\" (UniqueName: \"kubernetes.io/projected/651a24c2-d598-4be7-82e7-676ae6360537-kube-api-access-sf2j5\") pod \"651a24c2-d598-4be7-82e7-676ae6360537\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.650578 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-nb\") pod \"651a24c2-d598-4be7-82e7-676ae6360537\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.650606 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-dns-svc\") pod \"651a24c2-d598-4be7-82e7-676ae6360537\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.650629 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-sb\") pod \"651a24c2-d598-4be7-82e7-676ae6360537\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " Oct 10 08:26:13 crc 
kubenswrapper[4732]: I1010 08:26:13.650666 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-config\") pod \"651a24c2-d598-4be7-82e7-676ae6360537\" (UID: \"651a24c2-d598-4be7-82e7-676ae6360537\") " Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.651008 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35fb4d88-d33b-43ef-8a71-0b60e2576d33-logs\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.651048 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz2ln\" (UniqueName: \"kubernetes.io/projected/35fb4d88-d33b-43ef-8a71-0b60e2576d33-kube-api-access-vz2ln\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.651088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-config-data\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.651109 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.651131 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.656843 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651a24c2-d598-4be7-82e7-676ae6360537-kube-api-access-sf2j5" (OuterVolumeSpecName: "kube-api-access-sf2j5") pod "651a24c2-d598-4be7-82e7-676ae6360537" (UID: "651a24c2-d598-4be7-82e7-676ae6360537"). InnerVolumeSpecName "kube-api-access-sf2j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: W1010 08:26:13.677405 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaade023c_02ab_4532_a658_c29cba47a7ec.slice/crio-4b5b4eef5ecfd68a55bcd856be4ccf2d724440857871aac093417d1d63b772e6 WatchSource:0}: Error finding container 4b5b4eef5ecfd68a55bcd856be4ccf2d724440857871aac093417d1d63b772e6: Status 404 returned error can't find the container with id 4b5b4eef5ecfd68a55bcd856be4ccf2d724440857871aac093417d1d63b772e6 Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.686817 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f93119c-52c0-49b7-bff9-1833aa4bb249" path="/var/lib/kubelet/pods/2f93119c-52c0-49b7-bff9-1833aa4bb249/volumes" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.687941 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6" path="/var/lib/kubelet/pods/acd0f4c7-dfa1-4f13-b0ba-0e0ed15964f6/volumes" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.720752 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") 
pod "651a24c2-d598-4be7-82e7-676ae6360537" (UID: "651a24c2-d598-4be7-82e7-676ae6360537"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.729470 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "651a24c2-d598-4be7-82e7-676ae6360537" (UID: "651a24c2-d598-4be7-82e7-676ae6360537"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.731718 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-config" (OuterVolumeSpecName: "config") pod "651a24c2-d598-4be7-82e7-676ae6360537" (UID: "651a24c2-d598-4be7-82e7-676ae6360537"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.748165 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "651a24c2-d598-4be7-82e7-676ae6360537" (UID: "651a24c2-d598-4be7-82e7-676ae6360537"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.754789 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35fb4d88-d33b-43ef-8a71-0b60e2576d33-logs\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.754897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz2ln\" (UniqueName: \"kubernetes.io/projected/35fb4d88-d33b-43ef-8a71-0b60e2576d33-kube-api-access-vz2ln\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.754988 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-config-data\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755036 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755278 4732 reconciler_common.go:293] "Volume detached for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755299 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755311 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755323 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/651a24c2-d598-4be7-82e7-676ae6360537-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755334 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf2j5\" (UniqueName: \"kubernetes.io/projected/651a24c2-d598-4be7-82e7-676ae6360537-kube-api-access-sf2j5\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.755955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35fb4d88-d33b-43ef-8a71-0b60e2576d33-logs\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.757735 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.759750 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.760156 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.768627 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.770674 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-config-data\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.774820 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz2ln\" (UniqueName: \"kubernetes.io/projected/35fb4d88-d33b-43ef-8a71-0b60e2576d33-kube-api-access-vz2ln\") pod \"nova-metadata-0\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " pod="openstack/nova-metadata-0" Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.860999 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:26:13 crc kubenswrapper[4732]: I1010 08:26:13.888566 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.332584 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.441678 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aade023c-02ab-4532-a658-c29cba47a7ec","Type":"ContainerStarted","Data":"6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4"} Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.441855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aade023c-02ab-4532-a658-c29cba47a7ec","Type":"ContainerStarted","Data":"98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03"} Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.441920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aade023c-02ab-4532-a658-c29cba47a7ec","Type":"ContainerStarted","Data":"4b5b4eef5ecfd68a55bcd856be4ccf2d724440857871aac093417d1d63b772e6"} Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.444145 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35fb4d88-d33b-43ef-8a71-0b60e2576d33","Type":"ContainerStarted","Data":"c319ac427c2760a5347af405ee6658749c9c0fd4ffbb156ee72eb5778a6996d5"} Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.446995 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b45676f95-p75gt" event={"ID":"651a24c2-d598-4be7-82e7-676ae6360537","Type":"ContainerDied","Data":"af74bdb254bf640a59c0a6a6729670299e0a66206aa81b5b24f806808609a148"} Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.447039 4732 scope.go:117] "RemoveContainer" containerID="5d40840329fc7c583e9aad24c61f27064823d79014a39fe4313e9dc62d092abf" Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.447060 4732 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b45676f95-p75gt" Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.465665 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.465646717 podStartE2EDuration="2.465646717s" podCreationTimestamp="2025-10-10 08:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:14.456337737 +0000 UTC m=+5701.525928978" watchObservedRunningTime="2025-10-10 08:26:14.465646717 +0000 UTC m=+5701.535237988" Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.470185 4732 scope.go:117] "RemoveContainer" containerID="ee89d2f7cf7fc80918fd18cc58f60aca68859a37842864f5aa0235db3b2dab69" Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.485908 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b45676f95-p75gt"] Oct 10 08:26:14 crc kubenswrapper[4732]: I1010 08:26:14.493938 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b45676f95-p75gt"] Oct 10 08:26:15 crc kubenswrapper[4732]: I1010 08:26:15.461408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35fb4d88-d33b-43ef-8a71-0b60e2576d33","Type":"ContainerStarted","Data":"9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392"} Oct 10 08:26:15 crc kubenswrapper[4732]: I1010 08:26:15.462808 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35fb4d88-d33b-43ef-8a71-0b60e2576d33","Type":"ContainerStarted","Data":"c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236"} Oct 10 08:26:15 crc kubenswrapper[4732]: I1010 08:26:15.485381 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.485339211 podStartE2EDuration="2.485339211s" 
podCreationTimestamp="2025-10-10 08:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:15.485220998 +0000 UTC m=+5702.554812259" watchObservedRunningTime="2025-10-10 08:26:15.485339211 +0000 UTC m=+5702.554930462" Oct 10 08:26:15 crc kubenswrapper[4732]: I1010 08:26:15.673543 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651a24c2-d598-4be7-82e7-676ae6360537" path="/var/lib/kubelet/pods/651a24c2-d598-4be7-82e7-676ae6360537/volumes" Oct 10 08:26:18 crc kubenswrapper[4732]: I1010 08:26:18.661582 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:26:18 crc kubenswrapper[4732]: E1010 08:26:18.662565 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:26:18 crc kubenswrapper[4732]: I1010 08:26:18.889851 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 08:26:18 crc kubenswrapper[4732]: I1010 08:26:18.889912 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 08:26:20 crc kubenswrapper[4732]: I1010 08:26:20.787590 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 10 08:26:23 crc kubenswrapper[4732]: I1010 08:26:23.132843 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:26:23 crc kubenswrapper[4732]: I1010 08:26:23.133186 4732 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:26:23 crc kubenswrapper[4732]: I1010 08:26:23.889799 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:26:23 crc kubenswrapper[4732]: I1010 08:26:23.889862 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:26:24 crc kubenswrapper[4732]: I1010 08:26:24.213972 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:24 crc kubenswrapper[4732]: I1010 08:26:24.214363 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:24 crc kubenswrapper[4732]: I1010 08:26:24.908975 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:24 crc kubenswrapper[4732]: I1010 08:26:24.908985 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:33 crc kubenswrapper[4732]: I1010 08:26:33.521192 4732 scope.go:117] "RemoveContainer" 
containerID="2e64371bce8a8dbe6d35b45260ed30f9714a72ef683d4a641af2b0375f67e291" Oct 10 08:26:33 crc kubenswrapper[4732]: I1010 08:26:33.554634 4732 scope.go:117] "RemoveContainer" containerID="f459d4958359919152e0b6dffe9e375e60b549fc946215270aae98f9f46ee5fd" Oct 10 08:26:33 crc kubenswrapper[4732]: I1010 08:26:33.611447 4732 scope.go:117] "RemoveContainer" containerID="7026f96cdd6e572a52269789f1415e148f0c260105369d185998641f19a91abf" Oct 10 08:26:33 crc kubenswrapper[4732]: I1010 08:26:33.668377 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:26:33 crc kubenswrapper[4732]: E1010 08:26:33.668649 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:26:33 crc kubenswrapper[4732]: I1010 08:26:33.675426 4732 scope.go:117] "RemoveContainer" containerID="f5f1ae71b717cdc730140beaf1ca65d2ddcf46f608c3d80a7944271cab646f4b" Oct 10 08:26:34 crc kubenswrapper[4732]: I1010 08:26:34.173084 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:34 crc kubenswrapper[4732]: I1010 08:26:34.214957 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Oct 10 08:26:34 crc kubenswrapper[4732]: I1010 08:26:34.897938 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:34 crc kubenswrapper[4732]: I1010 08:26:34.897940 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.679272 4732 generic.go:334] "Generic (PLEG): container finished" podID="74749092-ceea-41e0-848c-ee1227f2bcb5" containerID="c8a825e728af6abc8dd7867631f639827e3c69cab4bb505fcd946900f82ab7ff" exitCode=137 Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.679577 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74749092-ceea-41e0-848c-ee1227f2bcb5","Type":"ContainerDied","Data":"c8a825e728af6abc8dd7867631f639827e3c69cab4bb505fcd946900f82ab7ff"} Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.679624 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74749092-ceea-41e0-848c-ee1227f2bcb5","Type":"ContainerDied","Data":"27a52052540b3c0f12aaff0219be60abd327b5fd5ef3ebbf9bb010ddca17f5c1"} Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.679635 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27a52052540b3c0f12aaff0219be60abd327b5fd5ef3ebbf9bb010ddca17f5c1" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.702677 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.879568 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-config-data\") pod \"74749092-ceea-41e0-848c-ee1227f2bcb5\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.879710 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-combined-ca-bundle\") pod \"74749092-ceea-41e0-848c-ee1227f2bcb5\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.879775 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nxk4\" (UniqueName: \"kubernetes.io/projected/74749092-ceea-41e0-848c-ee1227f2bcb5-kube-api-access-8nxk4\") pod \"74749092-ceea-41e0-848c-ee1227f2bcb5\" (UID: \"74749092-ceea-41e0-848c-ee1227f2bcb5\") " Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.886272 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74749092-ceea-41e0-848c-ee1227f2bcb5-kube-api-access-8nxk4" (OuterVolumeSpecName: "kube-api-access-8nxk4") pod "74749092-ceea-41e0-848c-ee1227f2bcb5" (UID: "74749092-ceea-41e0-848c-ee1227f2bcb5"). InnerVolumeSpecName "kube-api-access-8nxk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.906967 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74749092-ceea-41e0-848c-ee1227f2bcb5" (UID: "74749092-ceea-41e0-848c-ee1227f2bcb5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.911395 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-config-data" (OuterVolumeSpecName: "config-data") pod "74749092-ceea-41e0-848c-ee1227f2bcb5" (UID: "74749092-ceea-41e0-848c-ee1227f2bcb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.982104 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.982134 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nxk4\" (UniqueName: \"kubernetes.io/projected/74749092-ceea-41e0-848c-ee1227f2bcb5-kube-api-access-8nxk4\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:37 crc kubenswrapper[4732]: I1010 08:26:37.982144 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74749092-ceea-41e0-848c-ee1227f2bcb5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.687213 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.742294 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.750051 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.778195 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:38 crc kubenswrapper[4732]: E1010 08:26:38.778627 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74749092-ceea-41e0-848c-ee1227f2bcb5" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.778653 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="74749092-ceea-41e0-848c-ee1227f2bcb5" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.779267 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="74749092-ceea-41e0-848c-ee1227f2bcb5" containerName="nova-cell1-novncproxy-novncproxy" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.780005 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.785456 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.785602 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.785620 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.798529 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.899886 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kdwx\" (UniqueName: \"kubernetes.io/projected/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-kube-api-access-7kdwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.899938 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.899975 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:38 crc 
kubenswrapper[4732]: I1010 08:26:38.900030 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:38 crc kubenswrapper[4732]: I1010 08:26:38.900081 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.001816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.002140 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kdwx\" (UniqueName: \"kubernetes.io/projected/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-kube-api-access-7kdwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.002171 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc 
kubenswrapper[4732]: I1010 08:26:39.002206 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.002242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.005812 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.006492 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.007608 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.008535 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.020148 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kdwx\" (UniqueName: \"kubernetes.io/projected/63c5250c-8cab-4d8c-a3c5-e36be4ec6528-kube-api-access-7kdwx\") pod \"nova-cell1-novncproxy-0\" (UID: \"63c5250c-8cab-4d8c-a3c5-e36be4ec6528\") " pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.113266 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.594837 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.672155 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74749092-ceea-41e0-848c-ee1227f2bcb5" path="/var/lib/kubelet/pods/74749092-ceea-41e0-848c-ee1227f2bcb5/volumes" Oct 10 08:26:39 crc kubenswrapper[4732]: I1010 08:26:39.697444 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63c5250c-8cab-4d8c-a3c5-e36be4ec6528","Type":"ContainerStarted","Data":"89dc01d1ac840e49d83e74941caf6bfdd75711ed06c758bbc5bb775b2ff7a148"} Oct 10 08:26:40 crc kubenswrapper[4732]: I1010 08:26:40.711154 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"63c5250c-8cab-4d8c-a3c5-e36be4ec6528","Type":"ContainerStarted","Data":"9cc8a063666596719f922634b85c41ba2c6155522664fc34cace9107b6e4c173"} Oct 10 08:26:40 crc kubenswrapper[4732]: I1010 08:26:40.735164 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.735142573 podStartE2EDuration="2.735142573s" podCreationTimestamp="2025-10-10 08:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:40.730239912 +0000 UTC m=+5727.799831193" watchObservedRunningTime="2025-10-10 08:26:40.735142573 +0000 UTC m=+5727.804733814" Oct 10 08:26:41 crc kubenswrapper[4732]: I1010 08:26:41.726948 4732 generic.go:334] "Generic (PLEG): container finished" podID="64d8b8f1-2e34-4eb9-8568-93aac31b406b" containerID="7065c2b7322bb18d5ee8470c3d83a0a9d8f1a30a87b10444edefbe180e1bc723" exitCode=137 Oct 10 08:26:41 crc kubenswrapper[4732]: I1010 08:26:41.727029 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64d8b8f1-2e34-4eb9-8568-93aac31b406b","Type":"ContainerDied","Data":"7065c2b7322bb18d5ee8470c3d83a0a9d8f1a30a87b10444edefbe180e1bc723"} Oct 10 08:26:41 crc kubenswrapper[4732]: E1010 08:26:41.809854 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64d8b8f1_2e34_4eb9_8568_93aac31b406b.slice/crio-conmon-7065c2b7322bb18d5ee8470c3d83a0a9d8f1a30a87b10444edefbe180e1bc723.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.495750 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.604864 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-combined-ca-bundle\") pod \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.605004 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-config-data\") pod \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.605127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxbw2\" (UniqueName: \"kubernetes.io/projected/64d8b8f1-2e34-4eb9-8568-93aac31b406b-kube-api-access-jxbw2\") pod \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\" (UID: \"64d8b8f1-2e34-4eb9-8568-93aac31b406b\") " Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.609951 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d8b8f1-2e34-4eb9-8568-93aac31b406b-kube-api-access-jxbw2" (OuterVolumeSpecName: "kube-api-access-jxbw2") pod "64d8b8f1-2e34-4eb9-8568-93aac31b406b" (UID: "64d8b8f1-2e34-4eb9-8568-93aac31b406b"). InnerVolumeSpecName "kube-api-access-jxbw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.641464 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64d8b8f1-2e34-4eb9-8568-93aac31b406b" (UID: "64d8b8f1-2e34-4eb9-8568-93aac31b406b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.648484 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-config-data" (OuterVolumeSpecName: "config-data") pod "64d8b8f1-2e34-4eb9-8568-93aac31b406b" (UID: "64d8b8f1-2e34-4eb9-8568-93aac31b406b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.707653 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.707711 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64d8b8f1-2e34-4eb9-8568-93aac31b406b-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.707724 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxbw2\" (UniqueName: \"kubernetes.io/projected/64d8b8f1-2e34-4eb9-8568-93aac31b406b-kube-api-access-jxbw2\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.739435 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"64d8b8f1-2e34-4eb9-8568-93aac31b406b","Type":"ContainerDied","Data":"b68f13edb955abc02470b7ccabf0504b60e0d217261eb5be6f0e171d5dc9544c"} Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.739503 4732 scope.go:117] "RemoveContainer" containerID="7065c2b7322bb18d5ee8470c3d83a0a9d8f1a30a87b10444edefbe180e1bc723" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.739522 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.784751 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.800853 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.833684 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:42 crc kubenswrapper[4732]: E1010 08:26:42.834173 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d8b8f1-2e34-4eb9-8568-93aac31b406b" containerName="nova-scheduler-scheduler" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.834194 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d8b8f1-2e34-4eb9-8568-93aac31b406b" containerName="nova-scheduler-scheduler" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.834468 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d8b8f1-2e34-4eb9-8568-93aac31b406b" containerName="nova-scheduler-scheduler" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.835329 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.838394 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 08:26:42 crc kubenswrapper[4732]: I1010 08:26:42.844088 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.013719 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgx5\" (UniqueName: \"kubernetes.io/projected/569bf592-bd57-481d-9e3f-c4e909a44bf3-kube-api-access-ktgx5\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.014085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.014518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-config-data\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.116269 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-config-data\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.116467 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ktgx5\" (UniqueName: \"kubernetes.io/projected/569bf592-bd57-481d-9e3f-c4e909a44bf3-kube-api-access-ktgx5\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.116558 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.124205 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-config-data\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.124541 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.132004 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.132077 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.143499 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktgx5\" (UniqueName: \"kubernetes.io/projected/569bf592-bd57-481d-9e3f-c4e909a44bf3-kube-api-access-ktgx5\") pod \"nova-scheduler-0\" (UID: 
\"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.154864 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.673991 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d8b8f1-2e34-4eb9-8568-93aac31b406b" path="/var/lib/kubelet/pods/64d8b8f1-2e34-4eb9-8568-93aac31b406b/volumes" Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.683773 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:43 crc kubenswrapper[4732]: I1010 08:26:43.750544 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"569bf592-bd57-481d-9e3f-c4e909a44bf3","Type":"ContainerStarted","Data":"75ed557bbf76a1306f269a33210a22eef738dd8ad387a86e73a0e70dadab27f1"} Oct 10 08:26:44 crc kubenswrapper[4732]: I1010 08:26:44.114318 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:44 crc kubenswrapper[4732]: I1010 08:26:44.172999 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:44 crc kubenswrapper[4732]: I1010 08:26:44.214014 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:44 crc kubenswrapper[4732]: I1010 08:26:44.769922 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"569bf592-bd57-481d-9e3f-c4e909a44bf3","Type":"ContainerStarted","Data":"fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b"} Oct 10 08:26:44 crc kubenswrapper[4732]: I1010 08:26:44.792247 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.792231002 podStartE2EDuration="2.792231002s" podCreationTimestamp="2025-10-10 08:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:44.791466032 +0000 UTC m=+5731.861057273" watchObservedRunningTime="2025-10-10 08:26:44.792231002 +0000 UTC m=+5731.861822243" Oct 10 08:26:44 crc kubenswrapper[4732]: I1010 08:26:44.897842 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:44 crc kubenswrapper[4732]: I1010 08:26:44.897878 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:46 crc kubenswrapper[4732]: I1010 08:26:46.661249 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:26:46 crc kubenswrapper[4732]: E1010 08:26:46.662278 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:26:48 crc kubenswrapper[4732]: I1010 08:26:48.155950 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 08:26:49 crc kubenswrapper[4732]: I1010 08:26:49.113891 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:49 crc kubenswrapper[4732]: I1010 08:26:49.140237 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:49 crc kubenswrapper[4732]: I1010 08:26:49.854431 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.108429 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jwcrg"] Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.109558 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.111778 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.112461 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.127778 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jwcrg"] Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.162035 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.162434 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbn4c\" (UniqueName: \"kubernetes.io/projected/c5f73311-ec18-40b6-8a6f-00c817d7f036-kube-api-access-lbn4c\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.162590 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-scripts\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.162876 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-config-data\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.264082 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-scripts\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.264175 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-config-data\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.264221 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.264307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbn4c\" (UniqueName: \"kubernetes.io/projected/c5f73311-ec18-40b6-8a6f-00c817d7f036-kube-api-access-lbn4c\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.271185 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.271420 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-config-data\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.280941 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-scripts\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.283311 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbn4c\" (UniqueName: \"kubernetes.io/projected/c5f73311-ec18-40b6-8a6f-00c817d7f036-kube-api-access-lbn4c\") pod \"nova-cell1-cell-mapping-jwcrg\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.437640 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:50 crc kubenswrapper[4732]: I1010 08:26:50.951729 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jwcrg"] Oct 10 08:26:50 crc kubenswrapper[4732]: W1010 08:26:50.959011 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5f73311_ec18_40b6_8a6f_00c817d7f036.slice/crio-cee031f54a2161c9ac184e30c27f94eb98432e7bafa3604e2e915fbba809d5d0 WatchSource:0}: Error finding container cee031f54a2161c9ac184e30c27f94eb98432e7bafa3604e2e915fbba809d5d0: Status 404 returned error can't find the container with id cee031f54a2161c9ac184e30c27f94eb98432e7bafa3604e2e915fbba809d5d0 Oct 10 08:26:51 crc kubenswrapper[4732]: I1010 08:26:51.863874 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jwcrg" event={"ID":"c5f73311-ec18-40b6-8a6f-00c817d7f036","Type":"ContainerStarted","Data":"a75bb423b7f2e4c9f9707efddc211ef545dbca4282beddf20c0d424987ba9c06"} Oct 10 08:26:51 crc kubenswrapper[4732]: I1010 08:26:51.864496 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jwcrg" event={"ID":"c5f73311-ec18-40b6-8a6f-00c817d7f036","Type":"ContainerStarted","Data":"cee031f54a2161c9ac184e30c27f94eb98432e7bafa3604e2e915fbba809d5d0"} Oct 10 08:26:51 crc kubenswrapper[4732]: I1010 08:26:51.897997 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jwcrg" podStartSLOduration=1.897970887 podStartE2EDuration="1.897970887s" podCreationTimestamp="2025-10-10 08:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:26:51.880196011 +0000 UTC m=+5738.949787302" watchObservedRunningTime="2025-10-10 08:26:51.897970887 +0000 UTC m=+5738.967562158" Oct 10 08:26:53 crc 
kubenswrapper[4732]: I1010 08:26:53.155503 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 08:26:53 crc kubenswrapper[4732]: I1010 08:26:53.211564 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 08:26:53 crc kubenswrapper[4732]: I1010 08:26:53.931837 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 08:26:54 crc kubenswrapper[4732]: I1010 08:26:54.212971 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:54 crc kubenswrapper[4732]: I1010 08:26:54.213930 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.88:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:54 crc kubenswrapper[4732]: I1010 08:26:54.896836 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:54 crc kubenswrapper[4732]: I1010 08:26:54.896845 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:26:55 crc 
kubenswrapper[4732]: I1010 08:26:55.911501 4732 generic.go:334] "Generic (PLEG): container finished" podID="c5f73311-ec18-40b6-8a6f-00c817d7f036" containerID="a75bb423b7f2e4c9f9707efddc211ef545dbca4282beddf20c0d424987ba9c06" exitCode=0 Oct 10 08:26:55 crc kubenswrapper[4732]: I1010 08:26:55.911564 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jwcrg" event={"ID":"c5f73311-ec18-40b6-8a6f-00c817d7f036","Type":"ContainerDied","Data":"a75bb423b7f2e4c9f9707efddc211ef545dbca4282beddf20c0d424987ba9c06"} Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.251081 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.402142 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-combined-ca-bundle\") pod \"c5f73311-ec18-40b6-8a6f-00c817d7f036\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.402226 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-scripts\") pod \"c5f73311-ec18-40b6-8a6f-00c817d7f036\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.402368 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-config-data\") pod \"c5f73311-ec18-40b6-8a6f-00c817d7f036\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.402422 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbn4c\" (UniqueName: 
\"kubernetes.io/projected/c5f73311-ec18-40b6-8a6f-00c817d7f036-kube-api-access-lbn4c\") pod \"c5f73311-ec18-40b6-8a6f-00c817d7f036\" (UID: \"c5f73311-ec18-40b6-8a6f-00c817d7f036\") " Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.408227 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-scripts" (OuterVolumeSpecName: "scripts") pod "c5f73311-ec18-40b6-8a6f-00c817d7f036" (UID: "c5f73311-ec18-40b6-8a6f-00c817d7f036"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.409873 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f73311-ec18-40b6-8a6f-00c817d7f036-kube-api-access-lbn4c" (OuterVolumeSpecName: "kube-api-access-lbn4c") pod "c5f73311-ec18-40b6-8a6f-00c817d7f036" (UID: "c5f73311-ec18-40b6-8a6f-00c817d7f036"). InnerVolumeSpecName "kube-api-access-lbn4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.442926 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-config-data" (OuterVolumeSpecName: "config-data") pod "c5f73311-ec18-40b6-8a6f-00c817d7f036" (UID: "c5f73311-ec18-40b6-8a6f-00c817d7f036"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.444512 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f73311-ec18-40b6-8a6f-00c817d7f036" (UID: "c5f73311-ec18-40b6-8a6f-00c817d7f036"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.504490 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.504525 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbn4c\" (UniqueName: \"kubernetes.io/projected/c5f73311-ec18-40b6-8a6f-00c817d7f036-kube-api-access-lbn4c\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.504538 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.504548 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5f73311-ec18-40b6-8a6f-00c817d7f036-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.660834 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:26:57 crc kubenswrapper[4732]: E1010 08:26:57.661115 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.943534 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jwcrg" 
event={"ID":"c5f73311-ec18-40b6-8a6f-00c817d7f036","Type":"ContainerDied","Data":"cee031f54a2161c9ac184e30c27f94eb98432e7bafa3604e2e915fbba809d5d0"} Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.943577 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee031f54a2161c9ac184e30c27f94eb98432e7bafa3604e2e915fbba809d5d0" Oct 10 08:26:57 crc kubenswrapper[4732]: I1010 08:26:57.943624 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jwcrg" Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.113239 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.113569 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" containerID="cri-o://98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03" gracePeriod=30 Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.113682 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" containerID="cri-o://6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4" gracePeriod=30 Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.126543 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.126786 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" containerID="cri-o://fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" gracePeriod=30 Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.152835 4732 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.153283 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" containerID="cri-o://c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236" gracePeriod=30 Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.153460 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-metadata" containerID="cri-o://9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392" gracePeriod=30 Oct 10 08:26:58 crc kubenswrapper[4732]: E1010 08:26:58.159423 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:26:58 crc kubenswrapper[4732]: E1010 08:26:58.162944 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:26:58 crc kubenswrapper[4732]: E1010 08:26:58.164517 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:26:58 crc kubenswrapper[4732]: E1010 
08:26:58.164570 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.956055 4732 generic.go:334] "Generic (PLEG): container finished" podID="aade023c-02ab-4532-a658-c29cba47a7ec" containerID="98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03" exitCode=143 Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.956125 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aade023c-02ab-4532-a658-c29cba47a7ec","Type":"ContainerDied","Data":"98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03"} Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.958898 4732 generic.go:334] "Generic (PLEG): container finished" podID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerID="c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236" exitCode=143 Oct 10 08:26:58 crc kubenswrapper[4732]: I1010 08:26:58.958929 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35fb4d88-d33b-43ef-8a71-0b60e2576d33","Type":"ContainerDied","Data":"c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236"} Oct 10 08:27:03 crc kubenswrapper[4732]: E1010 08:27:03.158758 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:03 crc kubenswrapper[4732]: E1010 08:27:03.161604 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:03 crc kubenswrapper[4732]: E1010 08:27:03.163587 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:03 crc kubenswrapper[4732]: E1010 08:27:03.163656 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:08 crc kubenswrapper[4732]: E1010 08:27:08.157836 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:08 crc kubenswrapper[4732]: E1010 08:27:08.161379 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:08 crc kubenswrapper[4732]: E1010 08:27:08.162873 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:08 crc kubenswrapper[4732]: E1010 08:27:08.162920 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:08 crc kubenswrapper[4732]: I1010 08:27:08.660549 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:27:08 crc kubenswrapper[4732]: E1010 08:27:08.661354 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.035982 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.047310 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.106175 4732 generic.go:334] "Generic (PLEG): container finished" podID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerID="9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392" exitCode=0 Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.106374 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35fb4d88-d33b-43ef-8a71-0b60e2576d33","Type":"ContainerDied","Data":"9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392"} Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.106508 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35fb4d88-d33b-43ef-8a71-0b60e2576d33","Type":"ContainerDied","Data":"c319ac427c2760a5347af405ee6658749c9c0fd4ffbb156ee72eb5778a6996d5"} Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.106417 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.106531 4732 scope.go:117] "RemoveContainer" containerID="9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.109138 4732 generic.go:334] "Generic (PLEG): container finished" podID="aade023c-02ab-4532-a658-c29cba47a7ec" containerID="6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4" exitCode=0 Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.109172 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aade023c-02ab-4532-a658-c29cba47a7ec","Type":"ContainerDied","Data":"6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4"} Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.109184 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.109192 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"aade023c-02ab-4532-a658-c29cba47a7ec","Type":"ContainerDied","Data":"4b5b4eef5ecfd68a55bcd856be4ccf2d724440857871aac093417d1d63b772e6"} Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.141015 4732 scope.go:117] "RemoveContainer" containerID="c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.157910 4732 scope.go:117] "RemoveContainer" containerID="9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.158304 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392\": container with ID starting with 9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392 not found: ID does not exist" containerID="9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.158372 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392"} err="failed to get container status \"9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392\": rpc error: code = NotFound desc = could not find container \"9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392\": container with ID starting with 9a41118797d00f6f25a1e37c07b51e6a14ac946d2d4e921909a5199c88311392 not found: ID does not exist" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.158393 4732 scope.go:117] "RemoveContainer" containerID="c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 
08:27:12.158797 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236\": container with ID starting with c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236 not found: ID does not exist" containerID="c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.158859 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236"} err="failed to get container status \"c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236\": rpc error: code = NotFound desc = could not find container \"c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236\": container with ID starting with c697f1f2036a7dcafbd061dc023740f8388fc0c6a63803a57bbafde559636236 not found: ID does not exist" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.158877 4732 scope.go:117] "RemoveContainer" containerID="6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.180726 4732 scope.go:117] "RemoveContainer" containerID="98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.199897 4732 scope.go:117] "RemoveContainer" containerID="6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.200253 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4\": container with ID starting with 6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4 not found: ID does not exist" 
containerID="6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.200296 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4"} err="failed to get container status \"6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4\": rpc error: code = NotFound desc = could not find container \"6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4\": container with ID starting with 6ed814b0c493c073734cc34b16b677ff498517c849a0240c7f4d7183383854d4 not found: ID does not exist" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.200320 4732 scope.go:117] "RemoveContainer" containerID="98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.200578 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03\": container with ID starting with 98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03 not found: ID does not exist" containerID="98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.200606 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03"} err="failed to get container status \"98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03\": rpc error: code = NotFound desc = could not find container \"98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03\": container with ID starting with 98c4c48b6143aed814395e974b50d5b551ea45906e84be919185f35d57a8de03 not found: ID does not exist" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203279 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-combined-ca-bundle\") pod \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203335 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-config-data\") pod \"aade023c-02ab-4532-a658-c29cba47a7ec\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203365 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-config-data\") pod \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203385 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aade023c-02ab-4532-a658-c29cba47a7ec-logs\") pod \"aade023c-02ab-4532-a658-c29cba47a7ec\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203413 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz2ln\" (UniqueName: \"kubernetes.io/projected/35fb4d88-d33b-43ef-8a71-0b60e2576d33-kube-api-access-vz2ln\") pod \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203436 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35fb4d88-d33b-43ef-8a71-0b60e2576d33-logs\") pod \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\" (UID: 
\"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203478 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-nova-metadata-tls-certs\") pod \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\" (UID: \"35fb4d88-d33b-43ef-8a71-0b60e2576d33\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203546 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-combined-ca-bundle\") pod \"aade023c-02ab-4532-a658-c29cba47a7ec\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.203571 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg99h\" (UniqueName: \"kubernetes.io/projected/aade023c-02ab-4532-a658-c29cba47a7ec-kube-api-access-xg99h\") pod \"aade023c-02ab-4532-a658-c29cba47a7ec\" (UID: \"aade023c-02ab-4532-a658-c29cba47a7ec\") " Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.204422 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aade023c-02ab-4532-a658-c29cba47a7ec-logs" (OuterVolumeSpecName: "logs") pod "aade023c-02ab-4532-a658-c29cba47a7ec" (UID: "aade023c-02ab-4532-a658-c29cba47a7ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.204786 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35fb4d88-d33b-43ef-8a71-0b60e2576d33-logs" (OuterVolumeSpecName: "logs") pod "35fb4d88-d33b-43ef-8a71-0b60e2576d33" (UID: "35fb4d88-d33b-43ef-8a71-0b60e2576d33"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.208684 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35fb4d88-d33b-43ef-8a71-0b60e2576d33-kube-api-access-vz2ln" (OuterVolumeSpecName: "kube-api-access-vz2ln") pod "35fb4d88-d33b-43ef-8a71-0b60e2576d33" (UID: "35fb4d88-d33b-43ef-8a71-0b60e2576d33"). InnerVolumeSpecName "kube-api-access-vz2ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.208859 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aade023c-02ab-4532-a658-c29cba47a7ec-kube-api-access-xg99h" (OuterVolumeSpecName: "kube-api-access-xg99h") pod "aade023c-02ab-4532-a658-c29cba47a7ec" (UID: "aade023c-02ab-4532-a658-c29cba47a7ec"). InnerVolumeSpecName "kube-api-access-xg99h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.227968 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-config-data" (OuterVolumeSpecName: "config-data") pod "aade023c-02ab-4532-a658-c29cba47a7ec" (UID: "aade023c-02ab-4532-a658-c29cba47a7ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.228209 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35fb4d88-d33b-43ef-8a71-0b60e2576d33" (UID: "35fb4d88-d33b-43ef-8a71-0b60e2576d33"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.235801 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aade023c-02ab-4532-a658-c29cba47a7ec" (UID: "aade023c-02ab-4532-a658-c29cba47a7ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.237670 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-config-data" (OuterVolumeSpecName: "config-data") pod "35fb4d88-d33b-43ef-8a71-0b60e2576d33" (UID: "35fb4d88-d33b-43ef-8a71-0b60e2576d33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.256434 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "35fb4d88-d33b-43ef-8a71-0b60e2576d33" (UID: "35fb4d88-d33b-43ef-8a71-0b60e2576d33"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306005 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306057 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306077 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg99h\" (UniqueName: \"kubernetes.io/projected/aade023c-02ab-4532-a658-c29cba47a7ec-kube-api-access-xg99h\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306098 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306115 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aade023c-02ab-4532-a658-c29cba47a7ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306132 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35fb4d88-d33b-43ef-8a71-0b60e2576d33-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306149 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aade023c-02ab-4532-a658-c29cba47a7ec-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306167 4732 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz2ln\" (UniqueName: \"kubernetes.io/projected/35fb4d88-d33b-43ef-8a71-0b60e2576d33-kube-api-access-vz2ln\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.306183 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35fb4d88-d33b-43ef-8a71-0b60e2576d33-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.447400 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.457164 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.486474 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.499607 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.510580 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.511147 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511172 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.511202 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-metadata" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511213 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" 
containerName="nova-metadata-metadata" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.511249 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f73311-ec18-40b6-8a6f-00c817d7f036" containerName="nova-manage" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511258 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f73311-ec18-40b6-8a6f-00c817d7f036" containerName="nova-manage" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.511272 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511279 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.511293 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511301 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511542 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-log" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511565 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-log" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511578 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" containerName="nova-metadata-metadata" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511597 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f73311-ec18-40b6-8a6f-00c817d7f036" 
containerName="nova-manage" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.511611 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" containerName="nova-api-api" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.512972 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.517433 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.523792 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.525872 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.529538 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.529833 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.538631 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.570340 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.614533 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-logs\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.614575 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bkd\" (UniqueName: \"kubernetes.io/projected/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-kube-api-access-q4bkd\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.614709 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-logs\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.614965 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.615002 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-config-data\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.615071 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.615103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.615215 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kjb\" (UniqueName: \"kubernetes.io/projected/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-kube-api-access-j7kjb\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.615458 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-config-data\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: E1010 08:27:12.617607 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35fb4d88_d33b_43ef_8a71_0b60e2576d33.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaade023c_02ab_4532_a658_c29cba47a7ec.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35fb4d88_d33b_43ef_8a71_0b60e2576d33.slice/crio-c319ac427c2760a5347af405ee6658749c9c0fd4ffbb156ee72eb5778a6996d5\": RecentStats: unable to find data in memory cache]" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717112 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-config-data\") pod \"nova-api-0\" (UID: 
\"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717174 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-logs\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717192 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bkd\" (UniqueName: \"kubernetes.io/projected/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-kube-api-access-q4bkd\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717225 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-logs\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717286 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-config-data\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717352 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717373 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kjb\" (UniqueName: \"kubernetes.io/projected/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-kube-api-access-j7kjb\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.717886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-logs\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.718482 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-logs\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.722181 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 
08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.725316 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-config-data\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.728919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.731767 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.732107 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-config-data\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.734491 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bkd\" (UniqueName: \"kubernetes.io/projected/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-kube-api-access-q4bkd\") pod \"nova-api-0\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " pod="openstack/nova-api-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.736033 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kjb\" (UniqueName: 
\"kubernetes.io/projected/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-kube-api-access-j7kjb\") pod \"nova-metadata-0\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.863070 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 08:27:12 crc kubenswrapper[4732]: I1010 08:27:12.867204 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:13 crc kubenswrapper[4732]: E1010 08:27:13.158079 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:13 crc kubenswrapper[4732]: E1010 08:27:13.159682 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:13 crc kubenswrapper[4732]: E1010 08:27:13.160682 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:13 crc kubenswrapper[4732]: E1010 08:27:13.160749 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:13 crc kubenswrapper[4732]: I1010 08:27:13.348708 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 08:27:13 crc kubenswrapper[4732]: I1010 08:27:13.412902 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:13 crc kubenswrapper[4732]: W1010 08:27:13.420471 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f36e48_d3ec_4cda_95d8_7f1d4f517abb.slice/crio-9cd89b374e18eeca28fc5f6e05c7bf0e9efaf0e7ae111cc46489b85b9754960c WatchSource:0}: Error finding container 9cd89b374e18eeca28fc5f6e05c7bf0e9efaf0e7ae111cc46489b85b9754960c: Status 404 returned error can't find the container with id 9cd89b374e18eeca28fc5f6e05c7bf0e9efaf0e7ae111cc46489b85b9754960c Oct 10 08:27:13 crc kubenswrapper[4732]: I1010 08:27:13.678094 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35fb4d88-d33b-43ef-8a71-0b60e2576d33" path="/var/lib/kubelet/pods/35fb4d88-d33b-43ef-8a71-0b60e2576d33/volumes" Oct 10 08:27:13 crc kubenswrapper[4732]: I1010 08:27:13.679208 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aade023c-02ab-4532-a658-c29cba47a7ec" path="/var/lib/kubelet/pods/aade023c-02ab-4532-a658-c29cba47a7ec/volumes" Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.131090 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f36e48-d3ec-4cda-95d8-7f1d4f517abb","Type":"ContainerStarted","Data":"47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70"} Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.131450 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"79f36e48-d3ec-4cda-95d8-7f1d4f517abb","Type":"ContainerStarted","Data":"017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138"} Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.131464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f36e48-d3ec-4cda-95d8-7f1d4f517abb","Type":"ContainerStarted","Data":"9cd89b374e18eeca28fc5f6e05c7bf0e9efaf0e7ae111cc46489b85b9754960c"} Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.133204 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d","Type":"ContainerStarted","Data":"a48747372aa6f781c05187465d1498190ad8ecb8a1bc0ce35ede767afbc3a5dd"} Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.133365 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d","Type":"ContainerStarted","Data":"fb4f7baff8f2803876f74e2918bc3d30d740ef069aabbb540f9291ef255d5fe1"} Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.133455 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d","Type":"ContainerStarted","Data":"4467d879c7a1594b61df29bd85bd400632968b5a75940da6b16bb7c143115145"} Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.174470 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.174452902 podStartE2EDuration="2.174452902s" podCreationTimestamp="2025-10-10 08:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:27:14.17328065 +0000 UTC m=+5761.242871901" watchObservedRunningTime="2025-10-10 08:27:14.174452902 +0000 UTC m=+5761.244044143" Oct 10 08:27:14 crc kubenswrapper[4732]: I1010 08:27:14.180323 4732 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.180304348 podStartE2EDuration="2.180304348s" podCreationTimestamp="2025-10-10 08:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:27:14.155985237 +0000 UTC m=+5761.225576478" watchObservedRunningTime="2025-10-10 08:27:14.180304348 +0000 UTC m=+5761.249895589" Oct 10 08:27:17 crc kubenswrapper[4732]: I1010 08:27:17.864037 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 08:27:17 crc kubenswrapper[4732]: I1010 08:27:17.864354 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 08:27:18 crc kubenswrapper[4732]: E1010 08:27:18.159594 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:18 crc kubenswrapper[4732]: E1010 08:27:18.161554 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:18 crc kubenswrapper[4732]: E1010 08:27:18.163354 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 
08:27:18 crc kubenswrapper[4732]: E1010 08:27:18.163420 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:21 crc kubenswrapper[4732]: I1010 08:27:21.660140 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:27:21 crc kubenswrapper[4732]: E1010 08:27:21.660687 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:27:22 crc kubenswrapper[4732]: I1010 08:27:22.863921 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:27:22 crc kubenswrapper[4732]: I1010 08:27:22.864036 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 08:27:22 crc kubenswrapper[4732]: I1010 08:27:22.867379 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:27:22 crc kubenswrapper[4732]: I1010 08:27:22.867414 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:27:23 crc kubenswrapper[4732]: E1010 08:27:23.158808 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:23 crc kubenswrapper[4732]: E1010 08:27:23.160332 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:23 crc kubenswrapper[4732]: E1010 08:27:23.162043 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:23 crc kubenswrapper[4732]: E1010 08:27:23.162094 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:23 crc kubenswrapper[4732]: I1010 08:27:23.878911 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.94:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:27:23 crc kubenswrapper[4732]: I1010 08:27:23.961070 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.93:8774/\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Oct 10 08:27:23 crc kubenswrapper[4732]: I1010 08:27:23.961050 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.94:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:27:23 crc kubenswrapper[4732]: I1010 08:27:23.961091 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.93:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 08:27:28 crc kubenswrapper[4732]: E1010 08:27:28.155989 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b is running failed: container process not found" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:28 crc kubenswrapper[4732]: E1010 08:27:28.156765 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b is running failed: container process not found" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:28 crc kubenswrapper[4732]: E1010 08:27:28.157397 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b is running failed: container 
process not found" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 10 08:27:28 crc kubenswrapper[4732]: E1010 08:27:28.157433 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.331127 4732 generic.go:334] "Generic (PLEG): container finished" podID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" exitCode=137 Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.331214 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"569bf592-bd57-481d-9e3f-c4e909a44bf3","Type":"ContainerDied","Data":"fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b"} Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.539401 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.717061 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-config-data\") pod \"569bf592-bd57-481d-9e3f-c4e909a44bf3\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.717624 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-combined-ca-bundle\") pod \"569bf592-bd57-481d-9e3f-c4e909a44bf3\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.717757 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktgx5\" (UniqueName: \"kubernetes.io/projected/569bf592-bd57-481d-9e3f-c4e909a44bf3-kube-api-access-ktgx5\") pod \"569bf592-bd57-481d-9e3f-c4e909a44bf3\" (UID: \"569bf592-bd57-481d-9e3f-c4e909a44bf3\") " Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.731278 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/569bf592-bd57-481d-9e3f-c4e909a44bf3-kube-api-access-ktgx5" (OuterVolumeSpecName: "kube-api-access-ktgx5") pod "569bf592-bd57-481d-9e3f-c4e909a44bf3" (UID: "569bf592-bd57-481d-9e3f-c4e909a44bf3"). InnerVolumeSpecName "kube-api-access-ktgx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.750315 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "569bf592-bd57-481d-9e3f-c4e909a44bf3" (UID: "569bf592-bd57-481d-9e3f-c4e909a44bf3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.769338 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-config-data" (OuterVolumeSpecName: "config-data") pod "569bf592-bd57-481d-9e3f-c4e909a44bf3" (UID: "569bf592-bd57-481d-9e3f-c4e909a44bf3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.824913 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.826333 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktgx5\" (UniqueName: \"kubernetes.io/projected/569bf592-bd57-481d-9e3f-c4e909a44bf3-kube-api-access-ktgx5\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:28 crc kubenswrapper[4732]: I1010 08:27:28.826442 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/569bf592-bd57-481d-9e3f-c4e909a44bf3-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.345049 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"569bf592-bd57-481d-9e3f-c4e909a44bf3","Type":"ContainerDied","Data":"75ed557bbf76a1306f269a33210a22eef738dd8ad387a86e73a0e70dadab27f1"} Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.345105 4732 scope.go:117] "RemoveContainer" containerID="fd2c863addc45ff60444f05c5bd0c925522fa957b7ed92b028880a45e2f1622b" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.346314 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.387170 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.402429 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.420164 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:27:29 crc kubenswrapper[4732]: E1010 08:27:29.420666 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.420714 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.420976 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" containerName="nova-scheduler-scheduler" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.421769 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.427027 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.440970 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.541982 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-config-data\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.542086 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.542190 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk8qq\" (UniqueName: \"kubernetes.io/projected/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-kube-api-access-bk8qq\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.644348 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-config-data\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.644445 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.644493 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk8qq\" (UniqueName: \"kubernetes.io/projected/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-kube-api-access-bk8qq\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.651778 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-config-data\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.652111 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.664705 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk8qq\" (UniqueName: \"kubernetes.io/projected/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-kube-api-access-bk8qq\") pod \"nova-scheduler-0\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " pod="openstack/nova-scheduler-0" Oct 10 08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.676887 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="569bf592-bd57-481d-9e3f-c4e909a44bf3" path="/var/lib/kubelet/pods/569bf592-bd57-481d-9e3f-c4e909a44bf3/volumes" Oct 10 
08:27:29 crc kubenswrapper[4732]: I1010 08:27:29.750533 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 08:27:30 crc kubenswrapper[4732]: I1010 08:27:30.215268 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 08:27:30 crc kubenswrapper[4732]: W1010 08:27:30.218608 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04ac2b28_9f8e_40bb_9b11_1a70ebc6f745.slice/crio-1d9884f19e00c2cb0a36ebecc3d6966685fa1356cc8e05d5533b1f5e73078a7d WatchSource:0}: Error finding container 1d9884f19e00c2cb0a36ebecc3d6966685fa1356cc8e05d5533b1f5e73078a7d: Status 404 returned error can't find the container with id 1d9884f19e00c2cb0a36ebecc3d6966685fa1356cc8e05d5533b1f5e73078a7d Oct 10 08:27:30 crc kubenswrapper[4732]: I1010 08:27:30.356852 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745","Type":"ContainerStarted","Data":"1d9884f19e00c2cb0a36ebecc3d6966685fa1356cc8e05d5533b1f5e73078a7d"} Oct 10 08:27:31 crc kubenswrapper[4732]: I1010 08:27:31.367101 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745","Type":"ContainerStarted","Data":"bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d"} Oct 10 08:27:31 crc kubenswrapper[4732]: I1010 08:27:31.384924 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.384883079 podStartE2EDuration="2.384883079s" podCreationTimestamp="2025-10-10 08:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:27:31.382572217 +0000 UTC m=+5778.452163478" watchObservedRunningTime="2025-10-10 08:27:31.384883079 +0000 
UTC m=+5778.454474350" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.661324 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:27:32 crc kubenswrapper[4732]: E1010 08:27:32.662183 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.872313 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.874071 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.874204 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.874266 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.875284 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.879315 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 08:27:32 crc kubenswrapper[4732]: I1010 08:27:32.883971 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.393654 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.397451 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.405714 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.611218 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65fd6b8f6f-bdkjv"] Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.613046 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.639243 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65fd6b8f6f-bdkjv"] Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.746407 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-sb\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.746671 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-nb\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.746703 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-config\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: 
\"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.746729 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-dns-svc\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.746770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/cfb3b72c-df51-4d81-9318-98f6e1393879-kube-api-access-ff5md\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.848597 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-sb\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.848671 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-nb\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.849466 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-nb\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: 
\"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.849563 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-sb\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.850210 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-config\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.850847 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-config\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.850923 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-dns-svc\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.851853 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-dns-svc\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 
08:27:33.851909 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/cfb3b72c-df51-4d81-9318-98f6e1393879-kube-api-access-ff5md\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.863011 4732 scope.go:117] "RemoveContainer" containerID="3407d65780a652c9d4adf45e8752349c569186b2fa02ae356dd226bc901a3bbe" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.875604 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/cfb3b72c-df51-4d81-9318-98f6e1393879-kube-api-access-ff5md\") pod \"dnsmasq-dns-65fd6b8f6f-bdkjv\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.882249 4732 scope.go:117] "RemoveContainer" containerID="b12c3691d0cdba7073f7a1a1f7d9d957d30652b02a5c306783cef110128e1889" Oct 10 08:27:33 crc kubenswrapper[4732]: I1010 08:27:33.931610 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:34 crc kubenswrapper[4732]: I1010 08:27:34.098802 4732 scope.go:117] "RemoveContainer" containerID="a910e9de8a5fd88e36b61485dd740b0c45380d2f7ee7f977b7a0778bced7eacc" Oct 10 08:27:34 crc kubenswrapper[4732]: I1010 08:27:34.126134 4732 scope.go:117] "RemoveContainer" containerID="6c2d7ebbbb6224f73cc225552fb48fa7224a34348305f0e2c8cce9595095f4a2" Oct 10 08:27:34 crc kubenswrapper[4732]: I1010 08:27:34.177313 4732 scope.go:117] "RemoveContainer" containerID="6c01a50ac43d638037de195c4d8954f53c7bf57fd9c8f8cf61211366b022e3b7" Oct 10 08:27:34 crc kubenswrapper[4732]: W1010 08:27:34.428879 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfb3b72c_df51_4d81_9318_98f6e1393879.slice/crio-394c11ab6623a5fd0b3ab7b8296138bc1ef1fdc19b0cdab6fc16e5c318a12148 WatchSource:0}: Error finding container 394c11ab6623a5fd0b3ab7b8296138bc1ef1fdc19b0cdab6fc16e5c318a12148: Status 404 returned error can't find the container with id 394c11ab6623a5fd0b3ab7b8296138bc1ef1fdc19b0cdab6fc16e5c318a12148 Oct 10 08:27:34 crc kubenswrapper[4732]: I1010 08:27:34.438500 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65fd6b8f6f-bdkjv"] Oct 10 08:27:34 crc kubenswrapper[4732]: I1010 08:27:34.751676 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 08:27:35 crc kubenswrapper[4732]: I1010 08:27:35.412041 4732 generic.go:334] "Generic (PLEG): container finished" podID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerID="10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227" exitCode=0 Oct 10 08:27:35 crc kubenswrapper[4732]: I1010 08:27:35.412171 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" 
event={"ID":"cfb3b72c-df51-4d81-9318-98f6e1393879","Type":"ContainerDied","Data":"10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227"} Oct 10 08:27:35 crc kubenswrapper[4732]: I1010 08:27:35.412240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" event={"ID":"cfb3b72c-df51-4d81-9318-98f6e1393879","Type":"ContainerStarted","Data":"394c11ab6623a5fd0b3ab7b8296138bc1ef1fdc19b0cdab6fc16e5c318a12148"} Oct 10 08:27:36 crc kubenswrapper[4732]: I1010 08:27:36.147387 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:36 crc kubenswrapper[4732]: I1010 08:27:36.437737 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-log" containerID="cri-o://017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138" gracePeriod=30 Oct 10 08:27:36 crc kubenswrapper[4732]: I1010 08:27:36.439541 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" event={"ID":"cfb3b72c-df51-4d81-9318-98f6e1393879","Type":"ContainerStarted","Data":"9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230"} Oct 10 08:27:36 crc kubenswrapper[4732]: I1010 08:27:36.440058 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-api" containerID="cri-o://47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70" gracePeriod=30 Oct 10 08:27:36 crc kubenswrapper[4732]: I1010 08:27:36.440766 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:36 crc kubenswrapper[4732]: I1010 08:27:36.463551 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" podStartSLOduration=3.463510624 
podStartE2EDuration="3.463510624s" podCreationTimestamp="2025-10-10 08:27:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:27:36.459750883 +0000 UTC m=+5783.529342144" watchObservedRunningTime="2025-10-10 08:27:36.463510624 +0000 UTC m=+5783.533101865" Oct 10 08:27:37 crc kubenswrapper[4732]: I1010 08:27:37.448912 4732 generic.go:334] "Generic (PLEG): container finished" podID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerID="017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138" exitCode=143 Oct 10 08:27:37 crc kubenswrapper[4732]: I1010 08:27:37.449022 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f36e48-d3ec-4cda-95d8-7f1d4f517abb","Type":"ContainerDied","Data":"017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138"} Oct 10 08:27:39 crc kubenswrapper[4732]: I1010 08:27:39.751535 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 08:27:39 crc kubenswrapper[4732]: I1010 08:27:39.781425 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.069578 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.181981 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-logs\") pod \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.182175 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4bkd\" (UniqueName: \"kubernetes.io/projected/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-kube-api-access-q4bkd\") pod \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.182280 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-combined-ca-bundle\") pod \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.182327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-config-data\") pod \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\" (UID: \"79f36e48-d3ec-4cda-95d8-7f1d4f517abb\") " Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.182480 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-logs" (OuterVolumeSpecName: "logs") pod "79f36e48-d3ec-4cda-95d8-7f1d4f517abb" (UID: "79f36e48-d3ec-4cda-95d8-7f1d4f517abb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.184142 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.189242 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-kube-api-access-q4bkd" (OuterVolumeSpecName: "kube-api-access-q4bkd") pod "79f36e48-d3ec-4cda-95d8-7f1d4f517abb" (UID: "79f36e48-d3ec-4cda-95d8-7f1d4f517abb"). InnerVolumeSpecName "kube-api-access-q4bkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.214268 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79f36e48-d3ec-4cda-95d8-7f1d4f517abb" (UID: "79f36e48-d3ec-4cda-95d8-7f1d4f517abb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.237037 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-config-data" (OuterVolumeSpecName: "config-data") pod "79f36e48-d3ec-4cda-95d8-7f1d4f517abb" (UID: "79f36e48-d3ec-4cda-95d8-7f1d4f517abb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.286493 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4bkd\" (UniqueName: \"kubernetes.io/projected/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-kube-api-access-q4bkd\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.286841 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.286852 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f36e48-d3ec-4cda-95d8-7f1d4f517abb-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.480159 4732 generic.go:334] "Generic (PLEG): container finished" podID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerID="47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70" exitCode=0 Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.481054 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.492179 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f36e48-d3ec-4cda-95d8-7f1d4f517abb","Type":"ContainerDied","Data":"47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70"} Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.492250 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f36e48-d3ec-4cda-95d8-7f1d4f517abb","Type":"ContainerDied","Data":"9cd89b374e18eeca28fc5f6e05c7bf0e9efaf0e7ae111cc46489b85b9754960c"} Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.492276 4732 scope.go:117] "RemoveContainer" containerID="47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.550302 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.551332 4732 scope.go:117] "RemoveContainer" containerID="017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.558848 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.562659 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.580626 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:40 crc kubenswrapper[4732]: E1010 08:27:40.581061 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-log" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.581073 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-log" Oct 
10 08:27:40 crc kubenswrapper[4732]: E1010 08:27:40.581102 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-api" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.581108 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-api" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.581272 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-log" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.581286 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" containerName="nova-api-api" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.582311 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.584087 4732 scope.go:117] "RemoveContainer" containerID="47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.585011 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.585272 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.618914 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 10 08:27:40 crc kubenswrapper[4732]: E1010 08:27:40.619591 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70\": container with ID starting with 47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70 not found: ID 
does not exist" containerID="47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.619664 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.619653 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70"} err="failed to get container status \"47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70\": rpc error: code = NotFound desc = could not find container \"47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70\": container with ID starting with 47c3ef6c1a580bd33ae74f6c48ea11c7f05297a7b843feed5a3e4bbfbc92bc70 not found: ID does not exist" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.619817 4732 scope.go:117] "RemoveContainer" containerID="017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138" Oct 10 08:27:40 crc kubenswrapper[4732]: E1010 08:27:40.621191 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138\": container with ID starting with 017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138 not found: ID does not exist" containerID="017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.621236 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138"} err="failed to get container status \"017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138\": rpc error: code = NotFound desc = could not find container \"017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138\": container with ID starting with 
017485a2b554819aafe92a362890d029b12c4d0464045aaafa67420249efa138 not found: ID does not exist" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.649714 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.649773 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.649813 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-config-data\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.649837 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f30848-6b41-4978-9b03-afcbb1e9618e-logs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.649872 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-public-tls-certs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.649987 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgs7h\" (UniqueName: \"kubernetes.io/projected/88f30848-6b41-4978-9b03-afcbb1e9618e-kube-api-access-qgs7h\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.752212 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f30848-6b41-4978-9b03-afcbb1e9618e-logs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.752280 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-public-tls-certs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.752308 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgs7h\" (UniqueName: \"kubernetes.io/projected/88f30848-6b41-4978-9b03-afcbb1e9618e-kube-api-access-qgs7h\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.752458 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.752487 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-combined-ca-bundle\") 
pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.752520 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-config-data\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.752883 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f30848-6b41-4978-9b03-afcbb1e9618e-logs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.755680 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-public-tls-certs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.756017 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-config-data\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.759024 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.760350 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.773228 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgs7h\" (UniqueName: \"kubernetes.io/projected/88f30848-6b41-4978-9b03-afcbb1e9618e-kube-api-access-qgs7h\") pod \"nova-api-0\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " pod="openstack/nova-api-0" Oct 10 08:27:40 crc kubenswrapper[4732]: I1010 08:27:40.946681 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 08:27:41 crc kubenswrapper[4732]: I1010 08:27:41.398716 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 08:27:41 crc kubenswrapper[4732]: I1010 08:27:41.495321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f30848-6b41-4978-9b03-afcbb1e9618e","Type":"ContainerStarted","Data":"af3b23981bbc4b8a4d1e5685bc06dd84dcdf65636a20c626b0808dc9a484761a"} Oct 10 08:27:41 crc kubenswrapper[4732]: I1010 08:27:41.670688 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f36e48-d3ec-4cda-95d8-7f1d4f517abb" path="/var/lib/kubelet/pods/79f36e48-d3ec-4cda-95d8-7f1d4f517abb/volumes" Oct 10 08:27:42 crc kubenswrapper[4732]: I1010 08:27:42.508373 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f30848-6b41-4978-9b03-afcbb1e9618e","Type":"ContainerStarted","Data":"c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f"} Oct 10 08:27:42 crc kubenswrapper[4732]: I1010 08:27:42.509514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f30848-6b41-4978-9b03-afcbb1e9618e","Type":"ContainerStarted","Data":"c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468"} Oct 10 
08:27:42 crc kubenswrapper[4732]: I1010 08:27:42.548178 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.548147826 podStartE2EDuration="2.548147826s" podCreationTimestamp="2025-10-10 08:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:27:42.530610417 +0000 UTC m=+5789.600201728" watchObservedRunningTime="2025-10-10 08:27:42.548147826 +0000 UTC m=+5789.617739107" Oct 10 08:27:43 crc kubenswrapper[4732]: I1010 08:27:43.934920 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.040915 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b995548b9-n79sv"] Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.041170 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" podUID="0413febe-2fe9-4567-a937-4a24918cac93" containerName="dnsmasq-dns" containerID="cri-o://75b4a4eb6e9af1da58b504db37375eda1c9467b6c6bd15631730c57bb1029754" gracePeriod=10 Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.553361 4732 generic.go:334] "Generic (PLEG): container finished" podID="0413febe-2fe9-4567-a937-4a24918cac93" containerID="75b4a4eb6e9af1da58b504db37375eda1c9467b6c6bd15631730c57bb1029754" exitCode=0 Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.553673 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" event={"ID":"0413febe-2fe9-4567-a937-4a24918cac93","Type":"ContainerDied","Data":"75b4a4eb6e9af1da58b504db37375eda1c9467b6c6bd15631730c57bb1029754"} Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.553718 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" 
event={"ID":"0413febe-2fe9-4567-a937-4a24918cac93","Type":"ContainerDied","Data":"048fb19035e4175da793f9e482c97ca630fd2834dbf9d31a34375f69b800d8b3"} Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.553730 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="048fb19035e4175da793f9e482c97ca630fd2834dbf9d31a34375f69b800d8b3" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.575855 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.742430 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-sb\") pod \"0413febe-2fe9-4567-a937-4a24918cac93\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.742485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-dns-svc\") pod \"0413febe-2fe9-4567-a937-4a24918cac93\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.742612 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8r54\" (UniqueName: \"kubernetes.io/projected/0413febe-2fe9-4567-a937-4a24918cac93-kube-api-access-w8r54\") pod \"0413febe-2fe9-4567-a937-4a24918cac93\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.742658 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-nb\") pod \"0413febe-2fe9-4567-a937-4a24918cac93\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " Oct 10 08:27:44 crc 
kubenswrapper[4732]: I1010 08:27:44.742749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-config\") pod \"0413febe-2fe9-4567-a937-4a24918cac93\" (UID: \"0413febe-2fe9-4567-a937-4a24918cac93\") " Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.748399 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0413febe-2fe9-4567-a937-4a24918cac93-kube-api-access-w8r54" (OuterVolumeSpecName: "kube-api-access-w8r54") pod "0413febe-2fe9-4567-a937-4a24918cac93" (UID: "0413febe-2fe9-4567-a937-4a24918cac93"). InnerVolumeSpecName "kube-api-access-w8r54". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.800736 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0413febe-2fe9-4567-a937-4a24918cac93" (UID: "0413febe-2fe9-4567-a937-4a24918cac93"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.801768 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0413febe-2fe9-4567-a937-4a24918cac93" (UID: "0413febe-2fe9-4567-a937-4a24918cac93"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.808218 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0413febe-2fe9-4567-a937-4a24918cac93" (UID: "0413febe-2fe9-4567-a937-4a24918cac93"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.815958 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-config" (OuterVolumeSpecName: "config") pod "0413febe-2fe9-4567-a937-4a24918cac93" (UID: "0413febe-2fe9-4567-a937-4a24918cac93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.845525 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.845582 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.845593 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8r54\" (UniqueName: \"kubernetes.io/projected/0413febe-2fe9-4567-a937-4a24918cac93-kube-api-access-w8r54\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.845605 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:44 crc kubenswrapper[4732]: I1010 08:27:44.845613 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0413febe-2fe9-4567-a937-4a24918cac93-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:27:45 crc kubenswrapper[4732]: I1010 08:27:45.566118 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b995548b9-n79sv" Oct 10 08:27:45 crc kubenswrapper[4732]: I1010 08:27:45.600631 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b995548b9-n79sv"] Oct 10 08:27:45 crc kubenswrapper[4732]: I1010 08:27:45.607991 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b995548b9-n79sv"] Oct 10 08:27:45 crc kubenswrapper[4732]: I1010 08:27:45.672188 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0413febe-2fe9-4567-a937-4a24918cac93" path="/var/lib/kubelet/pods/0413febe-2fe9-4567-a937-4a24918cac93/volumes" Oct 10 08:27:46 crc kubenswrapper[4732]: I1010 08:27:46.660647 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:27:46 crc kubenswrapper[4732]: E1010 08:27:46.661396 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:27:50 crc kubenswrapper[4732]: I1010 08:27:50.947603 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:27:50 crc kubenswrapper[4732]: I1010 08:27:50.948362 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 08:27:51 crc kubenswrapper[4732]: I1010 08:27:51.959920 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.97:8774/\": net/http: request canceled (Client.Timeout exceeded while 
awaiting headers)" Oct 10 08:27:51 crc kubenswrapper[4732]: I1010 08:27:51.959943 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.97:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 08:27:57 crc kubenswrapper[4732]: I1010 08:27:57.660678 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:27:57 crc kubenswrapper[4732]: E1010 08:27:57.661789 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:28:00 crc kubenswrapper[4732]: I1010 08:28:00.956175 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:28:00 crc kubenswrapper[4732]: I1010 08:28:00.957310 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:28:00 crc kubenswrapper[4732]: I1010 08:28:00.959102 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 08:28:00 crc kubenswrapper[4732]: I1010 08:28:00.964855 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 08:28:01 crc kubenswrapper[4732]: I1010 08:28:01.720359 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 08:28:01 crc kubenswrapper[4732]: I1010 08:28:01.730461 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-api-0" Oct 10 08:28:12 crc kubenswrapper[4732]: I1010 08:28:12.662042 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:28:12 crc kubenswrapper[4732]: E1010 08:28:12.663345 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.826289 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67f68475f-2lmjl"] Oct 10 08:28:13 crc kubenswrapper[4732]: E1010 08:28:13.827090 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0413febe-2fe9-4567-a937-4a24918cac93" containerName="init" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.827108 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0413febe-2fe9-4567-a937-4a24918cac93" containerName="init" Oct 10 08:28:13 crc kubenswrapper[4732]: E1010 08:28:13.827155 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0413febe-2fe9-4567-a937-4a24918cac93" containerName="dnsmasq-dns" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.827163 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0413febe-2fe9-4567-a937-4a24918cac93" containerName="dnsmasq-dns" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.827484 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0413febe-2fe9-4567-a937-4a24918cac93" containerName="dnsmasq-dns" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.830184 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.838458 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-9tnrt" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.838726 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.838994 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.839356 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.847202 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67f68475f-2lmjl"] Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.901473 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.901775 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-log" containerID="cri-o://cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f" gracePeriod=30 Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.902327 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-httpd" containerID="cri-o://7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d" gracePeriod=30 Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.906736 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-horizon-secret-key\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.906845 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-scripts\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.906908 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-config-data\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.906937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgdn\" (UniqueName: \"kubernetes.io/projected/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-kube-api-access-qvgdn\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.906968 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-logs\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.933757 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.934029 
4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-log" containerID="cri-o://a712184ddbc53f6abd6e6a0fb6febfb97002bf450ffb8ee97fe9774aae0ec1ed" gracePeriod=30 Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.934442 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-httpd" containerID="cri-o://9f828a17179e7faacae59ede9226a1ecb9fd4da98b97042352f4833a133dd673" gracePeriod=30 Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.949000 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c997bdff-lpkdx"] Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.950493 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:13 crc kubenswrapper[4732]: I1010 08:28:13.968274 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c997bdff-lpkdx"] Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009290 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-config-data\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009345 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgdn\" (UniqueName: \"kubernetes.io/projected/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-kube-api-access-qvgdn\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009380 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tffjs\" (UniqueName: \"kubernetes.io/projected/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-kube-api-access-tffjs\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009411 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-logs\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009428 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-scripts\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009448 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-horizon-secret-key\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009468 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-horizon-secret-key\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009504 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-config-data\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-scripts\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.009592 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-logs\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.010905 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-logs\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.011021 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-config-data\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.011033 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-scripts\") pod 
\"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.017204 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-horizon-secret-key\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.033886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvgdn\" (UniqueName: \"kubernetes.io/projected/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-kube-api-access-qvgdn\") pod \"horizon-67f68475f-2lmjl\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.110657 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-logs\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.110759 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tffjs\" (UniqueName: \"kubernetes.io/projected/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-kube-api-access-tffjs\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.110795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-scripts\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 
10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.110816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-horizon-secret-key\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.110854 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-config-data\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.111158 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-logs\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.111728 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-scripts\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.111978 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-config-data\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.115040 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-horizon-secret-key\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.126305 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tffjs\" (UniqueName: \"kubernetes.io/projected/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-kube-api-access-tffjs\") pod \"horizon-7c997bdff-lpkdx\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.169636 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.287549 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:14 crc kubenswrapper[4732]: W1010 08:28:14.647246 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dfc26fa_a8d0_4dc7_b24e_a9d9460828d1.slice/crio-5466258295daf42c4bbb84486c5e4a4cf0f41ecf7a1fad53dfde91dc8a98de9c WatchSource:0}: Error finding container 5466258295daf42c4bbb84486c5e4a4cf0f41ecf7a1fad53dfde91dc8a98de9c: Status 404 returned error can't find the container with id 5466258295daf42c4bbb84486c5e4a4cf0f41ecf7a1fad53dfde91dc8a98de9c Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.649329 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67f68475f-2lmjl"] Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.760050 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c997bdff-lpkdx"] Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.884373 4732 generic.go:334] "Generic (PLEG): container finished" podID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" 
containerID="a712184ddbc53f6abd6e6a0fb6febfb97002bf450ffb8ee97fe9774aae0ec1ed" exitCode=143 Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.884436 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c4190a9-ad8a-4274-8860-a527e05aa3f5","Type":"ContainerDied","Data":"a712184ddbc53f6abd6e6a0fb6febfb97002bf450ffb8ee97fe9774aae0ec1ed"} Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.885416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c997bdff-lpkdx" event={"ID":"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265","Type":"ContainerStarted","Data":"8269b7bc6ffbcc9488a92348c0c6bd303c19198023930138df4edc8a4b04b6d5"} Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.886338 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f68475f-2lmjl" event={"ID":"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1","Type":"ContainerStarted","Data":"5466258295daf42c4bbb84486c5e4a4cf0f41ecf7a1fad53dfde91dc8a98de9c"} Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.889264 4732 generic.go:334] "Generic (PLEG): container finished" podID="66953b37-077e-49e6-905a-67c904325828" containerID="cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f" exitCode=143 Oct 10 08:28:14 crc kubenswrapper[4732]: I1010 08:28:14.889314 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66953b37-077e-49e6-905a-67c904325828","Type":"ContainerDied","Data":"cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f"} Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.832928 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67f68475f-2lmjl"] Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.860413 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c57dc8cd4-tczn7"] Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.861951 4732 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.865226 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.884627 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c57dc8cd4-tczn7"] Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.943686 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c997bdff-lpkdx"] Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.973614 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cbd6844cb-rnrwt"] Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.975153 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.976861 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-secret-key\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.976928 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9kb\" (UniqueName: \"kubernetes.io/projected/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-kube-api-access-2c9kb\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.976955 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-scripts\") pod 
\"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.977016 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-logs\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.977036 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-tls-certs\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.977101 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-config-data\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.977119 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-combined-ca-bundle\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:15 crc kubenswrapper[4732]: I1010 08:28:15.983360 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cbd6844cb-rnrwt"] Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.078502 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-secret-key\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.078555 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk5r5\" (UniqueName: \"kubernetes.io/projected/f934d435-c311-497b-9298-43a3f48e717f-kube-api-access-wk5r5\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.078585 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-combined-ca-bundle\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.078821 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-config-data\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.078856 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-combined-ca-bundle\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.078948 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-secret-key\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.078980 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-config-data\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.079056 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-tls-certs\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.079148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9kb\" (UniqueName: \"kubernetes.io/projected/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-kube-api-access-2c9kb\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.079198 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-scripts\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.079272 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934d435-c311-497b-9298-43a3f48e717f-logs\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.079348 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-scripts\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.079563 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-logs\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.079591 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-tls-certs\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.080016 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-scripts\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.080027 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-logs\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: 
\"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.080197 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-config-data\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.086066 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-secret-key\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.086103 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-combined-ca-bundle\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.087219 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-tls-certs\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.117093 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9kb\" (UniqueName: \"kubernetes.io/projected/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-kube-api-access-2c9kb\") pod \"horizon-7c57dc8cd4-tczn7\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc 
kubenswrapper[4732]: I1010 08:28:16.181493 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934d435-c311-497b-9298-43a3f48e717f-logs\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.181556 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-scripts\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.181782 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-secret-key\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.181832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk5r5\" (UniqueName: \"kubernetes.io/projected/f934d435-c311-497b-9298-43a3f48e717f-kube-api-access-wk5r5\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.181849 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-combined-ca-bundle\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.181911 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-config-data\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.181939 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-tls-certs\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.184019 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934d435-c311-497b-9298-43a3f48e717f-logs\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.184959 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-scripts\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.186967 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-secret-key\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.187100 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-combined-ca-bundle\") pod 
\"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.187887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-config-data\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.193309 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-tls-certs\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.205550 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.209930 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk5r5\" (UniqueName: \"kubernetes.io/projected/f934d435-c311-497b-9298-43a3f48e717f-kube-api-access-wk5r5\") pod \"horizon-5cbd6844cb-rnrwt\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.297550 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.651359 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c57dc8cd4-tczn7"] Oct 10 08:28:16 crc kubenswrapper[4732]: W1010 08:28:16.660119 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eb6dfe9_b83e_4930_a2b5_7b92c2e4f405.slice/crio-135f77760a6dd0e855d359a1f1e844b26963e8443d268244edd5987b2902b182 WatchSource:0}: Error finding container 135f77760a6dd0e855d359a1f1e844b26963e8443d268244edd5987b2902b182: Status 404 returned error can't find the container with id 135f77760a6dd0e855d359a1f1e844b26963e8443d268244edd5987b2902b182 Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.763851 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cbd6844cb-rnrwt"] Oct 10 08:28:16 crc kubenswrapper[4732]: W1010 08:28:16.770401 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf934d435_c311_497b_9298_43a3f48e717f.slice/crio-e15d3ed3dd4397268106ff22be226950a45da2f33a9fb897bc46d5d4b0f278d2 WatchSource:0}: Error finding container e15d3ed3dd4397268106ff22be226950a45da2f33a9fb897bc46d5d4b0f278d2: Status 404 returned error can't find the container with id e15d3ed3dd4397268106ff22be226950a45da2f33a9fb897bc46d5d4b0f278d2 Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.925917 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c57dc8cd4-tczn7" event={"ID":"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405","Type":"ContainerStarted","Data":"135f77760a6dd0e855d359a1f1e844b26963e8443d268244edd5987b2902b182"} Oct 10 08:28:16 crc kubenswrapper[4732]: I1010 08:28:16.926891 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbd6844cb-rnrwt" 
event={"ID":"f934d435-c311-497b-9298-43a3f48e717f","Type":"ContainerStarted","Data":"e15d3ed3dd4397268106ff22be226950a45da2f33a9fb897bc46d5d4b0f278d2"} Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.588169 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.719231 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-config-data\") pod \"66953b37-077e-49e6-905a-67c904325828\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.719276 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-public-tls-certs\") pod \"66953b37-077e-49e6-905a-67c904325828\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.719373 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-httpd-run\") pod \"66953b37-077e-49e6-905a-67c904325828\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.719434 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-combined-ca-bundle\") pod \"66953b37-077e-49e6-905a-67c904325828\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.719519 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmlc8\" (UniqueName: 
\"kubernetes.io/projected/66953b37-077e-49e6-905a-67c904325828-kube-api-access-hmlc8\") pod \"66953b37-077e-49e6-905a-67c904325828\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.719573 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-logs\") pod \"66953b37-077e-49e6-905a-67c904325828\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.719587 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-scripts\") pod \"66953b37-077e-49e6-905a-67c904325828\" (UID: \"66953b37-077e-49e6-905a-67c904325828\") " Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.722270 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "66953b37-077e-49e6-905a-67c904325828" (UID: "66953b37-077e-49e6-905a-67c904325828"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.722326 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-logs" (OuterVolumeSpecName: "logs") pod "66953b37-077e-49e6-905a-67c904325828" (UID: "66953b37-077e-49e6-905a-67c904325828"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.725528 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66953b37-077e-49e6-905a-67c904325828-kube-api-access-hmlc8" (OuterVolumeSpecName: "kube-api-access-hmlc8") pod "66953b37-077e-49e6-905a-67c904325828" (UID: "66953b37-077e-49e6-905a-67c904325828"). InnerVolumeSpecName "kube-api-access-hmlc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.727483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-scripts" (OuterVolumeSpecName: "scripts") pod "66953b37-077e-49e6-905a-67c904325828" (UID: "66953b37-077e-49e6-905a-67c904325828"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.754378 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66953b37-077e-49e6-905a-67c904325828" (UID: "66953b37-077e-49e6-905a-67c904325828"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.784327 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "66953b37-077e-49e6-905a-67c904325828" (UID: "66953b37-077e-49e6-905a-67c904325828"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.785320 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-config-data" (OuterVolumeSpecName: "config-data") pod "66953b37-077e-49e6-905a-67c904325828" (UID: "66953b37-077e-49e6-905a-67c904325828"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.822221 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.822279 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.822293 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.822306 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.822320 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmlc8\" (UniqueName: \"kubernetes.io/projected/66953b37-077e-49e6-905a-67c904325828-kube-api-access-hmlc8\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.822332 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/66953b37-077e-49e6-905a-67c904325828-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.822342 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66953b37-077e-49e6-905a-67c904325828-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.937003 4732 generic.go:334] "Generic (PLEG): container finished" podID="66953b37-077e-49e6-905a-67c904325828" containerID="7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d" exitCode=0 Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.937067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66953b37-077e-49e6-905a-67c904325828","Type":"ContainerDied","Data":"7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d"} Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.937092 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"66953b37-077e-49e6-905a-67c904325828","Type":"ContainerDied","Data":"52feb45cf3468ee9c1146a019110d74496ce34550ab0567cda6c20bda1705964"} Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.937111 4732 scope.go:117] "RemoveContainer" containerID="7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.937213 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.940296 4732 generic.go:334] "Generic (PLEG): container finished" podID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerID="9f828a17179e7faacae59ede9226a1ecb9fd4da98b97042352f4833a133dd673" exitCode=0 Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.940334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c4190a9-ad8a-4274-8860-a527e05aa3f5","Type":"ContainerDied","Data":"9f828a17179e7faacae59ede9226a1ecb9fd4da98b97042352f4833a133dd673"} Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.971944 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.979946 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.993985 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:28:17 crc kubenswrapper[4732]: E1010 08:28:17.994464 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-log" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.994490 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-log" Oct 10 08:28:17 crc kubenswrapper[4732]: E1010 08:28:17.994511 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-httpd" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.994518 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-httpd" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.994674 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-log" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.994711 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="66953b37-077e-49e6-905a-67c904325828" containerName="glance-httpd" Oct 10 08:28:17 crc kubenswrapper[4732]: I1010 08:28:17.996380 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.001525 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.001744 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.010183 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.133951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.134005 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-logs\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.134045 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tt265\" (UniqueName: \"kubernetes.io/projected/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-kube-api-access-tt265\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.134211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.134391 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-scripts\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.134732 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-config-data\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.134960 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.236739 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.237030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.237072 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-logs\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.237108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt265\" (UniqueName: \"kubernetes.io/projected/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-kube-api-access-tt265\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.237143 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.237172 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-scripts\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.237215 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-config-data\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.237986 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.238093 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-logs\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.244151 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-config-data\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.252618 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.252707 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.255474 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-scripts\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.281599 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt265\" (UniqueName: \"kubernetes.io/projected/2799c38e-8a3c-47ca-95c8-c401ce8f5c54-kube-api-access-tt265\") pod \"glance-default-external-api-0\" (UID: \"2799c38e-8a3c-47ca-95c8-c401ce8f5c54\") " pod="openstack/glance-default-external-api-0" Oct 10 08:28:18 crc kubenswrapper[4732]: I1010 08:28:18.337652 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 10 08:28:19 crc kubenswrapper[4732]: I1010 08:28:19.682813 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66953b37-077e-49e6-905a-67c904325828" path="/var/lib/kubelet/pods/66953b37-077e-49e6-905a-67c904325828/volumes" Oct 10 08:28:21 crc kubenswrapper[4732]: I1010 08:28:21.045883 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cnq2d"] Oct 10 08:28:21 crc kubenswrapper[4732]: I1010 08:28:21.059419 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cnq2d"] Oct 10 08:28:21 crc kubenswrapper[4732]: I1010 08:28:21.671829 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4475e71-36e3-4a0e-a144-a511d65d1cc3" path="/var/lib/kubelet/pods/a4475e71-36e3-4a0e-a144-a511d65d1cc3/volumes" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.192406 4732 scope.go:117] "RemoveContainer" containerID="cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.340130 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.393323 4732 scope.go:117] "RemoveContainer" containerID="7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d" Oct 10 08:28:23 crc kubenswrapper[4732]: E1010 08:28:23.395775 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d\": container with ID starting with 7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d not found: ID does not exist" containerID="7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.395816 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d"} err="failed to get container status \"7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d\": rpc error: code = NotFound desc = could not find container \"7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d\": container with ID starting with 7e911d0b39f6bf1718b73129b0b2136235bad7f8b06893047b6ca4e60b4ee15d not found: ID does not exist" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.395841 4732 scope.go:117] "RemoveContainer" containerID="cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f" Oct 10 08:28:23 crc kubenswrapper[4732]: E1010 08:28:23.396084 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f\": container with ID starting with cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f not found: ID does not exist" containerID="cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 
08:28:23.396100 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f"} err="failed to get container status \"cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f\": rpc error: code = NotFound desc = could not find container \"cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f\": container with ID starting with cd9bfe797655e8c7f80a3424e962c319dc2af4d196efacffee5c94784c7c701f not found: ID does not exist" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.445001 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk4pm\" (UniqueName: \"kubernetes.io/projected/4c4190a9-ad8a-4274-8860-a527e05aa3f5-kube-api-access-dk4pm\") pod \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.445099 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-config-data\") pod \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.445147 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-combined-ca-bundle\") pod \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.445185 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-scripts\") pod \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " Oct 10 08:28:23 crc kubenswrapper[4732]: 
I1010 08:28:23.445231 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-logs\") pod \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.445257 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-internal-tls-certs\") pod \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.445417 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-httpd-run\") pod \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\" (UID: \"4c4190a9-ad8a-4274-8860-a527e05aa3f5\") " Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.446321 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4c4190a9-ad8a-4274-8860-a527e05aa3f5" (UID: "4c4190a9-ad8a-4274-8860-a527e05aa3f5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.446415 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-logs" (OuterVolumeSpecName: "logs") pod "4c4190a9-ad8a-4274-8860-a527e05aa3f5" (UID: "4c4190a9-ad8a-4274-8860-a527e05aa3f5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.451665 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-scripts" (OuterVolumeSpecName: "scripts") pod "4c4190a9-ad8a-4274-8860-a527e05aa3f5" (UID: "4c4190a9-ad8a-4274-8860-a527e05aa3f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.452344 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c4190a9-ad8a-4274-8860-a527e05aa3f5-kube-api-access-dk4pm" (OuterVolumeSpecName: "kube-api-access-dk4pm") pod "4c4190a9-ad8a-4274-8860-a527e05aa3f5" (UID: "4c4190a9-ad8a-4274-8860-a527e05aa3f5"). InnerVolumeSpecName "kube-api-access-dk4pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.479876 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c4190a9-ad8a-4274-8860-a527e05aa3f5" (UID: "4c4190a9-ad8a-4274-8860-a527e05aa3f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.524941 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-config-data" (OuterVolumeSpecName: "config-data") pod "4c4190a9-ad8a-4274-8860-a527e05aa3f5" (UID: "4c4190a9-ad8a-4274-8860-a527e05aa3f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.530642 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4c4190a9-ad8a-4274-8860-a527e05aa3f5" (UID: "4c4190a9-ad8a-4274-8860-a527e05aa3f5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.547857 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.547940 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.547953 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.547983 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.547992 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4190a9-ad8a-4274-8860-a527e05aa3f5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.548000 4732 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4c4190a9-ad8a-4274-8860-a527e05aa3f5-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.548008 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk4pm\" (UniqueName: \"kubernetes.io/projected/4c4190a9-ad8a-4274-8860-a527e05aa3f5-kube-api-access-dk4pm\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:23 crc kubenswrapper[4732]: I1010 08:28:23.827013 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 10 08:28:23 crc kubenswrapper[4732]: W1010 08:28:23.847642 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2799c38e_8a3c_47ca_95c8_c401ce8f5c54.slice/crio-a3fd3f7cc5da6bc673d0b4f3165b4b032e3da47870a7708af589eb9b0ee605db WatchSource:0}: Error finding container a3fd3f7cc5da6bc673d0b4f3165b4b032e3da47870a7708af589eb9b0ee605db: Status 404 returned error can't find the container with id a3fd3f7cc5da6bc673d0b4f3165b4b032e3da47870a7708af589eb9b0ee605db Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.009937 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4c4190a9-ad8a-4274-8860-a527e05aa3f5","Type":"ContainerDied","Data":"a8835282a8857a058b41def75d56bb70bac3dc85ba7697f3d6f3a9c18852c48c"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.010248 4732 scope.go:117] "RemoveContainer" containerID="9f828a17179e7faacae59ede9226a1ecb9fd4da98b97042352f4833a133dd673" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.010037 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.015315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c997bdff-lpkdx" event={"ID":"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265","Type":"ContainerStarted","Data":"82515ccfd4cca81f2923c70265dcf6d1b7bbf47b65ca3cea907fa8f56a673b23"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.015358 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c997bdff-lpkdx" event={"ID":"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265","Type":"ContainerStarted","Data":"ba8f1c162fbb30d804227aa342d0d9ae239c6f5ac41fa7e5ec5cec1567319e91"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.015476 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c997bdff-lpkdx" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon-log" containerID="cri-o://ba8f1c162fbb30d804227aa342d0d9ae239c6f5ac41fa7e5ec5cec1567319e91" gracePeriod=30 Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.015730 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c997bdff-lpkdx" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon" containerID="cri-o://82515ccfd4cca81f2923c70265dcf6d1b7bbf47b65ca3cea907fa8f56a673b23" gracePeriod=30 Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.028204 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c57dc8cd4-tczn7" event={"ID":"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405","Type":"ContainerStarted","Data":"6a09e58856797a187d74e1f905d57adce1f029974d19d550f716a8403539b9d4"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.028247 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c57dc8cd4-tczn7" 
event={"ID":"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405","Type":"ContainerStarted","Data":"b9cf5c9faa7b52514ec956e63496a451122ca7662be50f389e9570422a13d3a1"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.037472 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f68475f-2lmjl" event={"ID":"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1","Type":"ContainerStarted","Data":"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.037521 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f68475f-2lmjl" event={"ID":"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1","Type":"ContainerStarted","Data":"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.037623 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67f68475f-2lmjl" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon-log" containerID="cri-o://52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d" gracePeriod=30 Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.037643 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67f68475f-2lmjl" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon" containerID="cri-o://106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7" gracePeriod=30 Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.038524 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.043599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbd6844cb-rnrwt" event={"ID":"f934d435-c311-497b-9298-43a3f48e717f","Type":"ContainerStarted","Data":"52ea17f600f69a171aaaaa793ba0984c355be42198e0ebdf848e2a5cefd69ab6"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 
08:28:24.043643 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbd6844cb-rnrwt" event={"ID":"f934d435-c311-497b-9298-43a3f48e717f","Type":"ContainerStarted","Data":"a39a3119019264e0284f247dfa899bd69435ff229af03952c369ca4a788c539e"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.049919 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.059034 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2799c38e-8a3c-47ca-95c8-c401ce8f5c54","Type":"ContainerStarted","Data":"a3fd3f7cc5da6bc673d0b4f3165b4b032e3da47870a7708af589eb9b0ee605db"} Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.059422 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:28:24 crc kubenswrapper[4732]: E1010 08:28:24.060373 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-log" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.060507 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-log" Oct 10 08:28:24 crc kubenswrapper[4732]: E1010 08:28:24.060541 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-httpd" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.060550 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-httpd" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.061923 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-log" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.061958 4732 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" containerName="glance-httpd" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.066764 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.072043 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c997bdff-lpkdx" podStartSLOduration=2.566582456 podStartE2EDuration="11.072023333s" podCreationTimestamp="2025-10-10 08:28:13 +0000 UTC" firstStartedPulling="2025-10-10 08:28:14.770223001 +0000 UTC m=+5821.839814252" lastFinishedPulling="2025-10-10 08:28:23.275663898 +0000 UTC m=+5830.345255129" observedRunningTime="2025-10-10 08:28:24.061382758 +0000 UTC m=+5831.130974019" watchObservedRunningTime="2025-10-10 08:28:24.072023333 +0000 UTC m=+5831.141614574" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.074963 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.075260 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.091974 4732 scope.go:117] "RemoveContainer" containerID="a712184ddbc53f6abd6e6a0fb6febfb97002bf450ffb8ee97fe9774aae0ec1ed" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.107480 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.114601 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c57dc8cd4-tczn7" podStartSLOduration=2.461101958 podStartE2EDuration="9.114581053s" podCreationTimestamp="2025-10-10 08:28:15 +0000 UTC" firstStartedPulling="2025-10-10 08:28:16.662329868 +0000 UTC m=+5823.731921109" 
lastFinishedPulling="2025-10-10 08:28:23.315808933 +0000 UTC m=+5830.385400204" observedRunningTime="2025-10-10 08:28:24.08944715 +0000 UTC m=+5831.159038411" watchObservedRunningTime="2025-10-10 08:28:24.114581053 +0000 UTC m=+5831.184172294" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.137360 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67f68475f-2lmjl" podStartSLOduration=2.379500136 podStartE2EDuration="11.137339932s" podCreationTimestamp="2025-10-10 08:28:13 +0000 UTC" firstStartedPulling="2025-10-10 08:28:14.649896269 +0000 UTC m=+5821.719487510" lastFinishedPulling="2025-10-10 08:28:23.407736065 +0000 UTC m=+5830.477327306" observedRunningTime="2025-10-10 08:28:24.128931087 +0000 UTC m=+5831.198522348" watchObservedRunningTime="2025-10-10 08:28:24.137339932 +0000 UTC m=+5831.206931183" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.157752 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cbd6844cb-rnrwt" podStartSLOduration=2.521735071 podStartE2EDuration="9.157735498s" podCreationTimestamp="2025-10-10 08:28:15 +0000 UTC" firstStartedPulling="2025-10-10 08:28:16.772314203 +0000 UTC m=+5823.841905444" lastFinishedPulling="2025-10-10 08:28:23.40831463 +0000 UTC m=+5830.477905871" observedRunningTime="2025-10-10 08:28:24.154122641 +0000 UTC m=+5831.223713902" watchObservedRunningTime="2025-10-10 08:28:24.157735498 +0000 UTC m=+5831.227326739" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.161534 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.161589 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f566fa71-b966-4a19-8843-9d2530fe37a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.161641 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqtw\" (UniqueName: \"kubernetes.io/projected/f566fa71-b966-4a19-8843-9d2530fe37a2-kube-api-access-4jqtw\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.161805 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f566fa71-b966-4a19-8843-9d2530fe37a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.161855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.162029 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.162063 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.170872 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.263590 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.263639 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f566fa71-b966-4a19-8843-9d2530fe37a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.263678 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jqtw\" (UniqueName: \"kubernetes.io/projected/f566fa71-b966-4a19-8843-9d2530fe37a2-kube-api-access-4jqtw\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.263730 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f566fa71-b966-4a19-8843-9d2530fe37a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.263764 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.263816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.263839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.264334 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f566fa71-b966-4a19-8843-9d2530fe37a2-logs\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.265319 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f566fa71-b966-4a19-8843-9d2530fe37a2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" 
Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.268814 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.272186 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.272345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.272763 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f566fa71-b966-4a19-8843-9d2530fe37a2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.282268 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqtw\" (UniqueName: \"kubernetes.io/projected/f566fa71-b966-4a19-8843-9d2530fe37a2-kube-api-access-4jqtw\") pod \"glance-default-internal-api-0\" (UID: \"f566fa71-b966-4a19-8843-9d2530fe37a2\") " pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.289161 4732 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.408455 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:24 crc kubenswrapper[4732]: I1010 08:28:24.935209 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 10 08:28:24 crc kubenswrapper[4732]: W1010 08:28:24.942672 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf566fa71_b966_4a19_8843_9d2530fe37a2.slice/crio-09a65ed2ad9a15c29a54e69d858a99904ca023fded8055839c3c54584ea22546 WatchSource:0}: Error finding container 09a65ed2ad9a15c29a54e69d858a99904ca023fded8055839c3c54584ea22546: Status 404 returned error can't find the container with id 09a65ed2ad9a15c29a54e69d858a99904ca023fded8055839c3c54584ea22546 Oct 10 08:28:25 crc kubenswrapper[4732]: I1010 08:28:25.099835 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2799c38e-8a3c-47ca-95c8-c401ce8f5c54","Type":"ContainerStarted","Data":"c20e54a97eb46f8927da7ecf495ba2bceb2998e38dbf9c4362a13643dab673ab"} Oct 10 08:28:25 crc kubenswrapper[4732]: I1010 08:28:25.104372 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f566fa71-b966-4a19-8843-9d2530fe37a2","Type":"ContainerStarted","Data":"09a65ed2ad9a15c29a54e69d858a99904ca023fded8055839c3c54584ea22546"} Oct 10 08:28:25 crc kubenswrapper[4732]: I1010 08:28:25.671635 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c4190a9-ad8a-4274-8860-a527e05aa3f5" path="/var/lib/kubelet/pods/4c4190a9-ad8a-4274-8860-a527e05aa3f5/volumes" Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.118216 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"f566fa71-b966-4a19-8843-9d2530fe37a2","Type":"ContainerStarted","Data":"d82d1891f7fa6ec816524489e58c2c811cd2ff1b89d7d4b8dbbc632293ad3194"} Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.118281 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f566fa71-b966-4a19-8843-9d2530fe37a2","Type":"ContainerStarted","Data":"11e1b89b87e8a8bccad116d0ea72e5d8f1b92f92db0a59b197c2bd5cb5f0a84b"} Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.123287 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2799c38e-8a3c-47ca-95c8-c401ce8f5c54","Type":"ContainerStarted","Data":"c0bf8a9e48aade1f0d7a0859996305eb622d343e9dec2910ad3ba40461ade3de"} Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.178176 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.17815214 podStartE2EDuration="9.17815214s" podCreationTimestamp="2025-10-10 08:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:28:26.175639353 +0000 UTC m=+5833.245230614" watchObservedRunningTime="2025-10-10 08:28:26.17815214 +0000 UTC m=+5833.247743381" Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.179095 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.179089065 podStartE2EDuration="2.179089065s" podCreationTimestamp="2025-10-10 08:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:28:26.152416601 +0000 UTC m=+5833.222007862" watchObservedRunningTime="2025-10-10 08:28:26.179089065 +0000 UTC m=+5833.248680296" Oct 10 08:28:26 crc 
kubenswrapper[4732]: I1010 08:28:26.199163 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.205647 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.298813 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:26 crc kubenswrapper[4732]: I1010 08:28:26.298896 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:27 crc kubenswrapper[4732]: I1010 08:28:27.661107 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:28:27 crc kubenswrapper[4732]: E1010 08:28:27.661839 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.263818 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ss94m"] Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.266240 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.272627 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ss94m"] Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.343265 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.343309 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.371855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-utilities\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.371936 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv2gb\" (UniqueName: \"kubernetes.io/projected/117f346c-f835-4009-9745-51db63e56dc0-kube-api-access-kv2gb\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.372023 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-catalog-content\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.439423 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.472822 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.473675 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-utilities\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.473795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2gb\" (UniqueName: \"kubernetes.io/projected/117f346c-f835-4009-9745-51db63e56dc0-kube-api-access-kv2gb\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.473945 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-catalog-content\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.475221 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-utilities\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.475240 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-catalog-content\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.514184 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2gb\" (UniqueName: \"kubernetes.io/projected/117f346c-f835-4009-9745-51db63e56dc0-kube-api-access-kv2gb\") pod \"redhat-operators-ss94m\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:28 crc kubenswrapper[4732]: I1010 08:28:28.636376 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:29 crc kubenswrapper[4732]: I1010 08:28:29.179848 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ss94m"] Oct 10 08:28:29 crc kubenswrapper[4732]: I1010 08:28:29.185489 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 08:28:29 crc kubenswrapper[4732]: I1010 08:28:29.185525 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 10 08:28:30 crc kubenswrapper[4732]: I1010 08:28:30.026124 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-84b1-account-create-llrv9"] Oct 10 08:28:30 crc kubenswrapper[4732]: I1010 08:28:30.034868 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-84b1-account-create-llrv9"] Oct 10 08:28:30 crc kubenswrapper[4732]: I1010 08:28:30.195846 4732 generic.go:334] "Generic (PLEG): container finished" podID="117f346c-f835-4009-9745-51db63e56dc0" containerID="6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20" exitCode=0 Oct 10 08:28:30 crc kubenswrapper[4732]: I1010 08:28:30.195957 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss94m" event={"ID":"117f346c-f835-4009-9745-51db63e56dc0","Type":"ContainerDied","Data":"6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20"} Oct 10 08:28:30 crc kubenswrapper[4732]: I1010 08:28:30.196426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss94m" event={"ID":"117f346c-f835-4009-9745-51db63e56dc0","Type":"ContainerStarted","Data":"15caf7e677c2b1980b90f49afa73a5bfd38fb9ec985c96cc0ddf737954fb56d5"} Oct 10 08:28:31 crc kubenswrapper[4732]: I1010 08:28:31.676313 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="391767a0-caac-4153-a01e-34aaebc47b86" path="/var/lib/kubelet/pods/391767a0-caac-4153-a01e-34aaebc47b86/volumes" Oct 10 08:28:31 crc kubenswrapper[4732]: I1010 08:28:31.699162 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 08:28:31 crc kubenswrapper[4732]: I1010 08:28:31.702329 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 10 08:28:32 crc kubenswrapper[4732]: I1010 08:28:32.217326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss94m" event={"ID":"117f346c-f835-4009-9745-51db63e56dc0","Type":"ContainerStarted","Data":"72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc"} Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.244857 4732 generic.go:334] "Generic (PLEG): container finished" podID="117f346c-f835-4009-9745-51db63e56dc0" containerID="72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc" exitCode=0 Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.245241 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss94m" 
event={"ID":"117f346c-f835-4009-9745-51db63e56dc0","Type":"ContainerDied","Data":"72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc"} Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.340521 4732 scope.go:117] "RemoveContainer" containerID="c6dd858f58c9642fc5947e750c6d8fc01af9df9d9cd71c143ce003e66d641b17" Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.364221 4732 scope.go:117] "RemoveContainer" containerID="5792f40aa29781e9a05d2f0f406ead89f46122612db3918775f9d6e56dc300ae" Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.409598 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.409642 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.442733 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:34 crc kubenswrapper[4732]: I1010 08:28:34.454339 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:35 crc kubenswrapper[4732]: I1010 08:28:35.264170 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:35 crc kubenswrapper[4732]: I1010 08:28:35.264218 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:36 crc kubenswrapper[4732]: I1010 08:28:36.200996 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7c57dc8cd4-tczn7" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.100:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.100:8443: connect: connection refused" Oct 10 
08:28:36 crc kubenswrapper[4732]: I1010 08:28:36.275454 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss94m" event={"ID":"117f346c-f835-4009-9745-51db63e56dc0","Type":"ContainerStarted","Data":"c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff"} Oct 10 08:28:36 crc kubenswrapper[4732]: I1010 08:28:36.306818 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ss94m" podStartSLOduration=3.141765995 podStartE2EDuration="8.306794783s" podCreationTimestamp="2025-10-10 08:28:28 +0000 UTC" firstStartedPulling="2025-10-10 08:28:30.197728686 +0000 UTC m=+5837.267319927" lastFinishedPulling="2025-10-10 08:28:35.362757474 +0000 UTC m=+5842.432348715" observedRunningTime="2025-10-10 08:28:36.290556198 +0000 UTC m=+5843.360147459" watchObservedRunningTime="2025-10-10 08:28:36.306794783 +0000 UTC m=+5843.376386024" Oct 10 08:28:36 crc kubenswrapper[4732]: I1010 08:28:36.312924 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5cbd6844cb-rnrwt" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.101:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.101:8443: connect: connection refused" Oct 10 08:28:37 crc kubenswrapper[4732]: I1010 08:28:37.359061 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:37 crc kubenswrapper[4732]: I1010 08:28:37.359469 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 10 08:28:37 crc kubenswrapper[4732]: I1010 08:28:37.509827 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 10 08:28:38 crc kubenswrapper[4732]: I1010 08:28:38.636557 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:38 crc kubenswrapper[4732]: I1010 08:28:38.636613 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:39 crc kubenswrapper[4732]: I1010 08:28:39.660531 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:28:39 crc kubenswrapper[4732]: E1010 08:28:39.661029 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:28:39 crc kubenswrapper[4732]: I1010 08:28:39.688824 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ss94m" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="registry-server" probeResult="failure" output=< Oct 10 08:28:39 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 08:28:39 crc kubenswrapper[4732]: > Oct 10 08:28:40 crc kubenswrapper[4732]: I1010 08:28:40.071498 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gwn4b"] Oct 10 08:28:40 crc kubenswrapper[4732]: I1010 08:28:40.082244 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gwn4b"] Oct 10 08:28:41 crc kubenswrapper[4732]: I1010 08:28:41.671928 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e539916a-97c8-4b30-9422-1b96bb610b3b" path="/var/lib/kubelet/pods/e539916a-97c8-4b30-9422-1b96bb610b3b/volumes" Oct 10 08:28:48 crc kubenswrapper[4732]: I1010 08:28:48.379882 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:48 crc kubenswrapper[4732]: I1010 08:28:48.444054 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:48 crc kubenswrapper[4732]: I1010 08:28:48.693906 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:48 crc kubenswrapper[4732]: I1010 08:28:48.755316 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:48 crc kubenswrapper[4732]: I1010 08:28:48.933969 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ss94m"] Oct 10 08:28:50 crc kubenswrapper[4732]: I1010 08:28:50.094518 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:28:50 crc kubenswrapper[4732]: I1010 08:28:50.137551 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:28:50 crc kubenswrapper[4732]: I1010 08:28:50.219066 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c57dc8cd4-tczn7"] Oct 10 08:28:50 crc kubenswrapper[4732]: I1010 08:28:50.440112 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ss94m" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="registry-server" containerID="cri-o://c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff" gracePeriod=2 Oct 10 08:28:50 crc kubenswrapper[4732]: I1010 08:28:50.440061 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c57dc8cd4-tczn7" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" 
containerID="cri-o://6a09e58856797a187d74e1f905d57adce1f029974d19d550f716a8403539b9d4" gracePeriod=30 Oct 10 08:28:50 crc kubenswrapper[4732]: I1010 08:28:50.439892 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c57dc8cd4-tczn7" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon-log" containerID="cri-o://b9cf5c9faa7b52514ec956e63496a451122ca7662be50f389e9570422a13d3a1" gracePeriod=30 Oct 10 08:28:50 crc kubenswrapper[4732]: I1010 08:28:50.915907 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.092648 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-catalog-content\") pod \"117f346c-f835-4009-9745-51db63e56dc0\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.092728 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-utilities\") pod \"117f346c-f835-4009-9745-51db63e56dc0\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.092755 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv2gb\" (UniqueName: \"kubernetes.io/projected/117f346c-f835-4009-9745-51db63e56dc0-kube-api-access-kv2gb\") pod \"117f346c-f835-4009-9745-51db63e56dc0\" (UID: \"117f346c-f835-4009-9745-51db63e56dc0\") " Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.093626 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-utilities" (OuterVolumeSpecName: "utilities") pod 
"117f346c-f835-4009-9745-51db63e56dc0" (UID: "117f346c-f835-4009-9745-51db63e56dc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.105066 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/117f346c-f835-4009-9745-51db63e56dc0-kube-api-access-kv2gb" (OuterVolumeSpecName: "kube-api-access-kv2gb") pod "117f346c-f835-4009-9745-51db63e56dc0" (UID: "117f346c-f835-4009-9745-51db63e56dc0"). InnerVolumeSpecName "kube-api-access-kv2gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.163350 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "117f346c-f835-4009-9745-51db63e56dc0" (UID: "117f346c-f835-4009-9745-51db63e56dc0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.194384 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.194415 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/117f346c-f835-4009-9745-51db63e56dc0-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.194426 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv2gb\" (UniqueName: \"kubernetes.io/projected/117f346c-f835-4009-9745-51db63e56dc0-kube-api-access-kv2gb\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.460009 4732 generic.go:334] "Generic (PLEG): container finished" podID="117f346c-f835-4009-9745-51db63e56dc0" containerID="c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff" exitCode=0 Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.460450 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss94m" event={"ID":"117f346c-f835-4009-9745-51db63e56dc0","Type":"ContainerDied","Data":"c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff"} Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.460505 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ss94m" event={"ID":"117f346c-f835-4009-9745-51db63e56dc0","Type":"ContainerDied","Data":"15caf7e677c2b1980b90f49afa73a5bfd38fb9ec985c96cc0ddf737954fb56d5"} Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.460544 4732 scope.go:117] "RemoveContainer" containerID="c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.460847 
4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ss94m" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.517561 4732 scope.go:117] "RemoveContainer" containerID="72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.549370 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ss94m"] Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.549791 4732 scope.go:117] "RemoveContainer" containerID="6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.565554 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ss94m"] Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.601682 4732 scope.go:117] "RemoveContainer" containerID="c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff" Oct 10 08:28:51 crc kubenswrapper[4732]: E1010 08:28:51.602088 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff\": container with ID starting with c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff not found: ID does not exist" containerID="c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.602134 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff"} err="failed to get container status \"c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff\": rpc error: code = NotFound desc = could not find container \"c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff\": container with ID starting with 
c604c12a467449e242f8ad3a63d4d4c1c1f190e1df67ce520cb5ee35e77511ff not found: ID does not exist" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.602159 4732 scope.go:117] "RemoveContainer" containerID="72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc" Oct 10 08:28:51 crc kubenswrapper[4732]: E1010 08:28:51.602450 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc\": container with ID starting with 72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc not found: ID does not exist" containerID="72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.602477 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc"} err="failed to get container status \"72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc\": rpc error: code = NotFound desc = could not find container \"72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc\": container with ID starting with 72d3b7ab7ef0f2e907afbb19d9565d6eb38c6443e584ab9731f334cc6e94f9cc not found: ID does not exist" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.602492 4732 scope.go:117] "RemoveContainer" containerID="6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20" Oct 10 08:28:51 crc kubenswrapper[4732]: E1010 08:28:51.602852 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20\": container with ID starting with 6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20 not found: ID does not exist" containerID="6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20" Oct 10 08:28:51 crc 
kubenswrapper[4732]: I1010 08:28:51.602914 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20"} err="failed to get container status \"6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20\": rpc error: code = NotFound desc = could not find container \"6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20\": container with ID starting with 6a45f24529738b7962619e769fb3b2ef2a305fa24f2b82e2d806593cb4633d20 not found: ID does not exist" Oct 10 08:28:51 crc kubenswrapper[4732]: I1010 08:28:51.669843 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="117f346c-f835-4009-9745-51db63e56dc0" path="/var/lib/kubelet/pods/117f346c-f835-4009-9745-51db63e56dc0/volumes" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.410297 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.459195 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-horizon-secret-key\") pod \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.459270 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvgdn\" (UniqueName: \"kubernetes.io/projected/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-kube-api-access-qvgdn\") pod \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.459326 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-config-data\") pod 
\"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.459461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-logs\") pod \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.459518 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-scripts\") pod \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\" (UID: \"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.460378 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-logs" (OuterVolumeSpecName: "logs") pod "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" (UID: "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.460542 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.464343 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" (UID: "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.464516 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-kube-api-access-qvgdn" (OuterVolumeSpecName: "kube-api-access-qvgdn") pod "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" (UID: "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1"). InnerVolumeSpecName "kube-api-access-qvgdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.482898 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-config-data" (OuterVolumeSpecName: "config-data") pod "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" (UID: "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.487426 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-scripts" (OuterVolumeSpecName: "scripts") pod "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" (UID: "4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.491197 4732 generic.go:334] "Generic (PLEG): container finished" podID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerID="82515ccfd4cca81f2923c70265dcf6d1b7bbf47b65ca3cea907fa8f56a673b23" exitCode=137 Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.491304 4732 generic.go:334] "Generic (PLEG): container finished" podID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerID="ba8f1c162fbb30d804227aa342d0d9ae239c6f5ac41fa7e5ec5cec1567319e91" exitCode=137 Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.491257 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c997bdff-lpkdx" event={"ID":"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265","Type":"ContainerDied","Data":"82515ccfd4cca81f2923c70265dcf6d1b7bbf47b65ca3cea907fa8f56a673b23"} Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.491408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c997bdff-lpkdx" event={"ID":"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265","Type":"ContainerDied","Data":"ba8f1c162fbb30d804227aa342d0d9ae239c6f5ac41fa7e5ec5cec1567319e91"} Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.492734 4732 generic.go:334] "Generic (PLEG): container finished" podID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerID="6a09e58856797a187d74e1f905d57adce1f029974d19d550f716a8403539b9d4" exitCode=0 Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.492783 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c57dc8cd4-tczn7" event={"ID":"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405","Type":"ContainerDied","Data":"6a09e58856797a187d74e1f905d57adce1f029974d19d550f716a8403539b9d4"} Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.494164 4732 generic.go:334] "Generic (PLEG): container finished" podID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerID="106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7" 
exitCode=137 Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.494188 4732 generic.go:334] "Generic (PLEG): container finished" podID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerID="52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d" exitCode=137 Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.494203 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f68475f-2lmjl" event={"ID":"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1","Type":"ContainerDied","Data":"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7"} Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.494223 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f68475f-2lmjl" event={"ID":"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1","Type":"ContainerDied","Data":"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d"} Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.494235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67f68475f-2lmjl" event={"ID":"4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1","Type":"ContainerDied","Data":"5466258295daf42c4bbb84486c5e4a4cf0f41ecf7a1fad53dfde91dc8a98de9c"} Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.494253 4732 scope.go:117] "RemoveContainer" containerID="106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.494389 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67f68475f-2lmjl" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.534739 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67f68475f-2lmjl"] Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.542827 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67f68475f-2lmjl"] Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.562076 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.562109 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.562120 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvgdn\" (UniqueName: \"kubernetes.io/projected/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-kube-api-access-qvgdn\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.562129 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.660244 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:28:54 crc kubenswrapper[4732]: E1010 08:28:54.660763 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.670970 4732 scope.go:117] "RemoveContainer" containerID="52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.719307 4732 scope.go:117] "RemoveContainer" containerID="106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7" Oct 10 08:28:54 crc kubenswrapper[4732]: E1010 08:28:54.719656 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7\": container with ID starting with 106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7 not found: ID does not exist" containerID="106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.719707 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7"} err="failed to get container status \"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7\": rpc error: code = NotFound desc = could not find container \"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7\": container with ID starting with 106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7 not found: ID does not exist" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.719728 4732 scope.go:117] "RemoveContainer" containerID="52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d" Oct 10 08:28:54 crc kubenswrapper[4732]: E1010 08:28:54.720217 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d\": container with ID starting with 52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d not found: ID does not exist" containerID="52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.720249 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d"} err="failed to get container status \"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d\": rpc error: code = NotFound desc = could not find container \"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d\": container with ID starting with 52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d not found: ID does not exist" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.720269 4732 scope.go:117] "RemoveContainer" containerID="106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.720598 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7"} err="failed to get container status \"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7\": rpc error: code = NotFound desc = could not find container \"106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7\": container with ID starting with 106fef0b9c4543851f901abd34b29779b452a3853cdbf1a14b28c3a9f9d8d4c7 not found: ID does not exist" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.720641 4732 scope.go:117] "RemoveContainer" containerID="52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.720958 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d"} err="failed to get container status \"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d\": rpc error: code = NotFound desc = could not find container \"52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d\": container with ID starting with 52057ba79256a897ab140d46f7133f7cd93e02866575328996c569b3c6815a0d not found: ID does not exist" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.938378 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.967249 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-config-data\") pod \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.967470 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-scripts\") pod \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.967496 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-horizon-secret-key\") pod \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.967609 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tffjs\" (UniqueName: \"kubernetes.io/projected/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-kube-api-access-tffjs\") pod 
\"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.967655 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-logs\") pod \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\" (UID: \"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265\") " Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.968024 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-logs" (OuterVolumeSpecName: "logs") pod "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" (UID: "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.968367 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.977839 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" (UID: "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.977882 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-kube-api-access-tffjs" (OuterVolumeSpecName: "kube-api-access-tffjs") pod "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" (UID: "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265"). InnerVolumeSpecName "kube-api-access-tffjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.992049 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-config-data" (OuterVolumeSpecName: "config-data") pod "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" (UID: "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:28:54 crc kubenswrapper[4732]: I1010 08:28:54.999343 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-scripts" (OuterVolumeSpecName: "scripts") pod "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" (UID: "5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.069681 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tffjs\" (UniqueName: \"kubernetes.io/projected/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-kube-api-access-tffjs\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.069718 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.069728 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.069736 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:28:55 
crc kubenswrapper[4732]: I1010 08:28:55.511587 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c997bdff-lpkdx" event={"ID":"5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265","Type":"ContainerDied","Data":"8269b7bc6ffbcc9488a92348c0c6bd303c19198023930138df4edc8a4b04b6d5"} Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.511674 4732 scope.go:117] "RemoveContainer" containerID="82515ccfd4cca81f2923c70265dcf6d1b7bbf47b65ca3cea907fa8f56a673b23" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.511922 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c997bdff-lpkdx" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.563283 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c997bdff-lpkdx"] Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.573365 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c997bdff-lpkdx"] Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.671265 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" path="/var/lib/kubelet/pods/4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1/volumes" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.672144 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" path="/var/lib/kubelet/pods/5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265/volumes" Oct 10 08:28:55 crc kubenswrapper[4732]: I1010 08:28:55.735001 4732 scope.go:117] "RemoveContainer" containerID="ba8f1c162fbb30d804227aa342d0d9ae239c6f5ac41fa7e5ec5cec1567319e91" Oct 10 08:28:56 crc kubenswrapper[4732]: I1010 08:28:56.206771 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c57dc8cd4-tczn7" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.100:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 
10.217.1.100:8443: connect: connection refused" Oct 10 08:29:06 crc kubenswrapper[4732]: I1010 08:29:06.206801 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c57dc8cd4-tczn7" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.100:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.100:8443: connect: connection refused" Oct 10 08:29:06 crc kubenswrapper[4732]: I1010 08:29:06.665255 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:29:06 crc kubenswrapper[4732]: E1010 08:29:06.665710 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:29:16 crc kubenswrapper[4732]: I1010 08:29:16.206651 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7c57dc8cd4-tczn7" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.100:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.100:8443: connect: connection refused" Oct 10 08:29:16 crc kubenswrapper[4732]: I1010 08:29:16.207273 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:29:18 crc kubenswrapper[4732]: I1010 08:29:18.660292 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:29:18 crc kubenswrapper[4732]: E1010 08:29:18.660932 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.786985 4732 generic.go:334] "Generic (PLEG): container finished" podID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerID="b9cf5c9faa7b52514ec956e63496a451122ca7662be50f389e9570422a13d3a1" exitCode=137 Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.787148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c57dc8cd4-tczn7" event={"ID":"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405","Type":"ContainerDied","Data":"b9cf5c9faa7b52514ec956e63496a451122ca7662be50f389e9570422a13d3a1"} Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.896033 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.988454 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-logs\") pod \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.988529 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-scripts\") pod \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.988565 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-secret-key\") pod \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.988669 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9kb\" (UniqueName: \"kubernetes.io/projected/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-kube-api-access-2c9kb\") pod \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.988722 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-tls-certs\") pod \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.988800 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-combined-ca-bundle\") pod \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.988833 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-config-data\") pod \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\" (UID: \"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405\") " Oct 10 08:29:20 crc kubenswrapper[4732]: I1010 08:29:20.990035 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-logs" (OuterVolumeSpecName: "logs") pod "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" (UID: "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:20.995464 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" (UID: "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:20.997901 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-kube-api-access-2c9kb" (OuterVolumeSpecName: "kube-api-access-2c9kb") pod "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" (UID: "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405"). InnerVolumeSpecName "kube-api-access-2c9kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.029830 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-scripts" (OuterVolumeSpecName: "scripts") pod "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" (UID: "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.030272 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" (UID: "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.035008 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-config-data" (OuterVolumeSpecName: "config-data") pod "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" (UID: "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.063229 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" (UID: "4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.090969 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c9kb\" (UniqueName: \"kubernetes.io/projected/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-kube-api-access-2c9kb\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.091009 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.091018 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.091027 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.091040 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.091049 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.091061 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.800730 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c57dc8cd4-tczn7" event={"ID":"4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405","Type":"ContainerDied","Data":"135f77760a6dd0e855d359a1f1e844b26963e8443d268244edd5987b2902b182"} Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.801356 4732 scope.go:117] "RemoveContainer" containerID="6a09e58856797a187d74e1f905d57adce1f029974d19d550f716a8403539b9d4" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.800806 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c57dc8cd4-tczn7" Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.846453 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c57dc8cd4-tczn7"] Oct 10 08:29:21 crc kubenswrapper[4732]: I1010 08:29:21.855299 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c57dc8cd4-tczn7"] Oct 10 08:29:22 crc kubenswrapper[4732]: I1010 08:29:22.040529 4732 scope.go:117] "RemoveContainer" containerID="b9cf5c9faa7b52514ec956e63496a451122ca7662be50f389e9570422a13d3a1" Oct 10 08:29:23 crc kubenswrapper[4732]: I1010 08:29:23.677800 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" path="/var/lib/kubelet/pods/4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405/volumes" Oct 10 08:29:29 crc kubenswrapper[4732]: I1010 08:29:29.661020 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:29:29 crc kubenswrapper[4732]: E1010 08:29:29.661879 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.453053 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-567cbfd676-jtwjz"] Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.453919 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.453934 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.453953 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.453962 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.453981 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="extract-utilities" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.453990 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="extract-utilities" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.454001 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454010 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.454023 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="extract-content" Oct 10 08:29:31 crc 
kubenswrapper[4732]: I1010 08:29:31.454033 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="extract-content" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.454046 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="registry-server" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454054 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="registry-server" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.454081 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454090 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.454106 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454113 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: E1010 08:29:31.454131 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454139 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454377 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454392 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454402 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfaf4b7-7cc9-40fd-bfb8-d0d8b3e1e265" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454416 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon-log" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454437 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="117f346c-f835-4009-9745-51db63e56dc0" containerName="registry-server" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454455 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfc26fa-a8d0-4dc7-b24e-a9d9460828d1" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.454467 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb6dfe9-b83e-4930-a2b5-7b92c2e4f405" containerName="horizon" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.456570 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.478334 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-567cbfd676-jtwjz"] Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.495380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-horizon-secret-key\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.495447 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d08ea8-b473-4840-9663-9f74ed2cf748-scripts\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.495570 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-combined-ca-bundle\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.495666 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvbfb\" (UniqueName: \"kubernetes.io/projected/19d08ea8-b473-4840-9663-9f74ed2cf748-kube-api-access-dvbfb\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.495740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-horizon-tls-certs\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.495823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19d08ea8-b473-4840-9663-9f74ed2cf748-logs\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.495858 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d08ea8-b473-4840-9663-9f74ed2cf748-config-data\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.597432 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-horizon-secret-key\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.597521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d08ea8-b473-4840-9663-9f74ed2cf748-scripts\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.597565 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-combined-ca-bundle\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.597832 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvbfb\" (UniqueName: \"kubernetes.io/projected/19d08ea8-b473-4840-9663-9f74ed2cf748-kube-api-access-dvbfb\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.597919 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-horizon-tls-certs\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.598004 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19d08ea8-b473-4840-9663-9f74ed2cf748-logs\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.598036 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d08ea8-b473-4840-9663-9f74ed2cf748-config-data\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.598612 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19d08ea8-b473-4840-9663-9f74ed2cf748-logs\") pod \"horizon-567cbfd676-jtwjz\" (UID: 
\"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.598667 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/19d08ea8-b473-4840-9663-9f74ed2cf748-scripts\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.599487 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/19d08ea8-b473-4840-9663-9f74ed2cf748-config-data\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.604468 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-horizon-secret-key\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.619522 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-horizon-tls-certs\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.619806 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d08ea8-b473-4840-9663-9f74ed2cf748-combined-ca-bundle\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 
08:29:31.621484 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvbfb\" (UniqueName: \"kubernetes.io/projected/19d08ea8-b473-4840-9663-9f74ed2cf748-kube-api-access-dvbfb\") pod \"horizon-567cbfd676-jtwjz\" (UID: \"19d08ea8-b473-4840-9663-9f74ed2cf748\") " pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:31 crc kubenswrapper[4732]: I1010 08:29:31.795671 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.247473 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-567cbfd676-jtwjz"] Oct 10 08:29:32 crc kubenswrapper[4732]: W1010 08:29:32.248584 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19d08ea8_b473_4840_9663_9f74ed2cf748.slice/crio-634e89872de76da98e941af7f2879f83c8a7fc3fbbf06a247050ab906c021220 WatchSource:0}: Error finding container 634e89872de76da98e941af7f2879f83c8a7fc3fbbf06a247050ab906c021220: Status 404 returned error can't find the container with id 634e89872de76da98e941af7f2879f83c8a7fc3fbbf06a247050ab906c021220 Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.534821 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-wz54z"] Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.536570 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-wz54z" Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.542448 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-wz54z"] Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.631211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77zxv\" (UniqueName: \"kubernetes.io/projected/87466246-6e08-4e6d-9dd6-50f57a188992-kube-api-access-77zxv\") pod \"heat-db-create-wz54z\" (UID: \"87466246-6e08-4e6d-9dd6-50f57a188992\") " pod="openstack/heat-db-create-wz54z" Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.733570 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77zxv\" (UniqueName: \"kubernetes.io/projected/87466246-6e08-4e6d-9dd6-50f57a188992-kube-api-access-77zxv\") pod \"heat-db-create-wz54z\" (UID: \"87466246-6e08-4e6d-9dd6-50f57a188992\") " pod="openstack/heat-db-create-wz54z" Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.749068 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77zxv\" (UniqueName: \"kubernetes.io/projected/87466246-6e08-4e6d-9dd6-50f57a188992-kube-api-access-77zxv\") pod \"heat-db-create-wz54z\" (UID: \"87466246-6e08-4e6d-9dd6-50f57a188992\") " pod="openstack/heat-db-create-wz54z" Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.856872 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-wz54z" Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.937619 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567cbfd676-jtwjz" event={"ID":"19d08ea8-b473-4840-9663-9f74ed2cf748","Type":"ContainerStarted","Data":"5d0390e7efa7ae5c7144c4fc4d76a7a0932a2d5db98b0491669bcbd8dcc84b7f"} Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.937708 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567cbfd676-jtwjz" event={"ID":"19d08ea8-b473-4840-9663-9f74ed2cf748","Type":"ContainerStarted","Data":"ab3c3dae5b900085d53278936efcc124003e948eb1999b2a25340921cf587c9d"} Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.937721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-567cbfd676-jtwjz" event={"ID":"19d08ea8-b473-4840-9663-9f74ed2cf748","Type":"ContainerStarted","Data":"634e89872de76da98e941af7f2879f83c8a7fc3fbbf06a247050ab906c021220"} Oct 10 08:29:32 crc kubenswrapper[4732]: I1010 08:29:32.973439 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-567cbfd676-jtwjz" podStartSLOduration=1.9734147050000002 podStartE2EDuration="1.973414705s" podCreationTimestamp="2025-10-10 08:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:29:32.968412031 +0000 UTC m=+5900.038003332" watchObservedRunningTime="2025-10-10 08:29:32.973414705 +0000 UTC m=+5900.043005986" Oct 10 08:29:33 crc kubenswrapper[4732]: I1010 08:29:33.340353 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-wz54z"] Oct 10 08:29:33 crc kubenswrapper[4732]: W1010 08:29:33.354225 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87466246_6e08_4e6d_9dd6_50f57a188992.slice/crio-75c9d77d4ba5427a4afb028750981dba53e9a5ca8a07674a33504c1c37f33e93 WatchSource:0}: Error finding container 75c9d77d4ba5427a4afb028750981dba53e9a5ca8a07674a33504c1c37f33e93: Status 404 returned error can't find the container with id 75c9d77d4ba5427a4afb028750981dba53e9a5ca8a07674a33504c1c37f33e93 Oct 10 08:29:33 crc kubenswrapper[4732]: I1010 08:29:33.956853 4732 generic.go:334] "Generic (PLEG): container finished" podID="87466246-6e08-4e6d-9dd6-50f57a188992" containerID="30b23039d67ae40203460f6ebf2c97d7959110e286631e999f57f3efc1a83019" exitCode=0 Oct 10 08:29:33 crc kubenswrapper[4732]: I1010 08:29:33.956970 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wz54z" event={"ID":"87466246-6e08-4e6d-9dd6-50f57a188992","Type":"ContainerDied","Data":"30b23039d67ae40203460f6ebf2c97d7959110e286631e999f57f3efc1a83019"} Oct 10 08:29:33 crc kubenswrapper[4732]: I1010 08:29:33.957300 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wz54z" event={"ID":"87466246-6e08-4e6d-9dd6-50f57a188992","Type":"ContainerStarted","Data":"75c9d77d4ba5427a4afb028750981dba53e9a5ca8a07674a33504c1c37f33e93"} Oct 10 08:29:34 crc kubenswrapper[4732]: I1010 08:29:34.051906 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zcpf6"] Oct 10 08:29:34 crc kubenswrapper[4732]: I1010 08:29:34.072733 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zcpf6"] Oct 10 08:29:34 crc kubenswrapper[4732]: I1010 08:29:34.675299 4732 scope.go:117] "RemoveContainer" containerID="0532bb3ac7d72c03abf990dcab0bab9cf8006d75f64b010a4ca5b97f3457e0fa" Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 08:29:35.355058 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-wz54z" Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 08:29:35.392472 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77zxv\" (UniqueName: \"kubernetes.io/projected/87466246-6e08-4e6d-9dd6-50f57a188992-kube-api-access-77zxv\") pod \"87466246-6e08-4e6d-9dd6-50f57a188992\" (UID: \"87466246-6e08-4e6d-9dd6-50f57a188992\") " Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 08:29:35.399105 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87466246-6e08-4e6d-9dd6-50f57a188992-kube-api-access-77zxv" (OuterVolumeSpecName: "kube-api-access-77zxv") pod "87466246-6e08-4e6d-9dd6-50f57a188992" (UID: "87466246-6e08-4e6d-9dd6-50f57a188992"). InnerVolumeSpecName "kube-api-access-77zxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 08:29:35.495314 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77zxv\" (UniqueName: \"kubernetes.io/projected/87466246-6e08-4e6d-9dd6-50f57a188992-kube-api-access-77zxv\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 08:29:35.674783 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb708b8-d5fa-425a-b1b2-919e632bb7b8" path="/var/lib/kubelet/pods/3eb708b8-d5fa-425a-b1b2-919e632bb7b8/volumes" Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 08:29:35.977406 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-wz54z" event={"ID":"87466246-6e08-4e6d-9dd6-50f57a188992","Type":"ContainerDied","Data":"75c9d77d4ba5427a4afb028750981dba53e9a5ca8a07674a33504c1c37f33e93"} Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 08:29:35.977440 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75c9d77d4ba5427a4afb028750981dba53e9a5ca8a07674a33504c1c37f33e93" Oct 10 08:29:35 crc kubenswrapper[4732]: I1010 
08:29:35.977454 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-wz54z" Oct 10 08:29:41 crc kubenswrapper[4732]: I1010 08:29:41.796140 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:41 crc kubenswrapper[4732]: I1010 08:29:41.796501 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.630232 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-2708-account-create-rtsmq"] Oct 10 08:29:42 crc kubenswrapper[4732]: E1010 08:29:42.632370 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87466246-6e08-4e6d-9dd6-50f57a188992" containerName="mariadb-database-create" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.632400 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="87466246-6e08-4e6d-9dd6-50f57a188992" containerName="mariadb-database-create" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.632712 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="87466246-6e08-4e6d-9dd6-50f57a188992" containerName="mariadb-database-create" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.633477 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2708-account-create-rtsmq" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.636305 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.644439 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442sg\" (UniqueName: \"kubernetes.io/projected/f199b1c8-bd78-4dda-b8ef-fe655bfedee1-kube-api-access-442sg\") pod \"heat-2708-account-create-rtsmq\" (UID: \"f199b1c8-bd78-4dda-b8ef-fe655bfedee1\") " pod="openstack/heat-2708-account-create-rtsmq" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.649931 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2708-account-create-rtsmq"] Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.747495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442sg\" (UniqueName: \"kubernetes.io/projected/f199b1c8-bd78-4dda-b8ef-fe655bfedee1-kube-api-access-442sg\") pod \"heat-2708-account-create-rtsmq\" (UID: \"f199b1c8-bd78-4dda-b8ef-fe655bfedee1\") " pod="openstack/heat-2708-account-create-rtsmq" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.767746 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442sg\" (UniqueName: \"kubernetes.io/projected/f199b1c8-bd78-4dda-b8ef-fe655bfedee1-kube-api-access-442sg\") pod \"heat-2708-account-create-rtsmq\" (UID: \"f199b1c8-bd78-4dda-b8ef-fe655bfedee1\") " pod="openstack/heat-2708-account-create-rtsmq" Oct 10 08:29:42 crc kubenswrapper[4732]: I1010 08:29:42.968787 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2708-account-create-rtsmq" Oct 10 08:29:43 crc kubenswrapper[4732]: I1010 08:29:43.370849 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-2708-account-create-rtsmq"] Oct 10 08:29:44 crc kubenswrapper[4732]: I1010 08:29:44.079839 4732 generic.go:334] "Generic (PLEG): container finished" podID="f199b1c8-bd78-4dda-b8ef-fe655bfedee1" containerID="d606e88eff500763c03a15e62df988e8bc219a9846be9e921fbfa4be6ff6ea9e" exitCode=0 Oct 10 08:29:44 crc kubenswrapper[4732]: I1010 08:29:44.079896 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2708-account-create-rtsmq" event={"ID":"f199b1c8-bd78-4dda-b8ef-fe655bfedee1","Type":"ContainerDied","Data":"d606e88eff500763c03a15e62df988e8bc219a9846be9e921fbfa4be6ff6ea9e"} Oct 10 08:29:44 crc kubenswrapper[4732]: I1010 08:29:44.079935 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2708-account-create-rtsmq" event={"ID":"f199b1c8-bd78-4dda-b8ef-fe655bfedee1","Type":"ContainerStarted","Data":"36a23b51ebcce8ced6f6bff7b14851cb37b4ff60b848d227ecdfaa27bd91a967"} Oct 10 08:29:44 crc kubenswrapper[4732]: I1010 08:29:44.660983 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:29:44 crc kubenswrapper[4732]: E1010 08:29:44.661565 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:29:45 crc kubenswrapper[4732]: I1010 08:29:45.038579 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8ec8-account-create-46pz2"] Oct 10 08:29:45 crc 
kubenswrapper[4732]: I1010 08:29:45.046839 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8ec8-account-create-46pz2"] Oct 10 08:29:45 crc kubenswrapper[4732]: I1010 08:29:45.425821 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-2708-account-create-rtsmq" Oct 10 08:29:45 crc kubenswrapper[4732]: I1010 08:29:45.606435 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-442sg\" (UniqueName: \"kubernetes.io/projected/f199b1c8-bd78-4dda-b8ef-fe655bfedee1-kube-api-access-442sg\") pod \"f199b1c8-bd78-4dda-b8ef-fe655bfedee1\" (UID: \"f199b1c8-bd78-4dda-b8ef-fe655bfedee1\") " Oct 10 08:29:45 crc kubenswrapper[4732]: I1010 08:29:45.619935 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f199b1c8-bd78-4dda-b8ef-fe655bfedee1-kube-api-access-442sg" (OuterVolumeSpecName: "kube-api-access-442sg") pod "f199b1c8-bd78-4dda-b8ef-fe655bfedee1" (UID: "f199b1c8-bd78-4dda-b8ef-fe655bfedee1"). InnerVolumeSpecName "kube-api-access-442sg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:29:45 crc kubenswrapper[4732]: I1010 08:29:45.679384 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add98675-a136-495e-ac8d-45049b57e51b" path="/var/lib/kubelet/pods/add98675-a136-495e-ac8d-45049b57e51b/volumes" Oct 10 08:29:45 crc kubenswrapper[4732]: I1010 08:29:45.709849 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-442sg\" (UniqueName: \"kubernetes.io/projected/f199b1c8-bd78-4dda-b8ef-fe655bfedee1-kube-api-access-442sg\") on node \"crc\" DevicePath \"\"" Oct 10 08:29:46 crc kubenswrapper[4732]: I1010 08:29:46.101016 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-2708-account-create-rtsmq" event={"ID":"f199b1c8-bd78-4dda-b8ef-fe655bfedee1","Type":"ContainerDied","Data":"36a23b51ebcce8ced6f6bff7b14851cb37b4ff60b848d227ecdfaa27bd91a967"} Oct 10 08:29:46 crc kubenswrapper[4732]: I1010 08:29:46.101069 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36a23b51ebcce8ced6f6bff7b14851cb37b4ff60b848d227ecdfaa27bd91a967" Oct 10 08:29:46 crc kubenswrapper[4732]: I1010 08:29:46.101184 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-2708-account-create-rtsmq" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.687879 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-9c2bk"] Oct 10 08:29:47 crc kubenswrapper[4732]: E1010 08:29:47.688658 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f199b1c8-bd78-4dda-b8ef-fe655bfedee1" containerName="mariadb-account-create" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.688674 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f199b1c8-bd78-4dda-b8ef-fe655bfedee1" containerName="mariadb-account-create" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.688959 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f199b1c8-bd78-4dda-b8ef-fe655bfedee1" containerName="mariadb-account-create" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.689803 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.692766 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2kcz6" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.700543 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.730428 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9c2bk"] Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.754844 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvxqx\" (UniqueName: \"kubernetes.io/projected/0b0a935b-06d8-47e6-856f-d7f9b048d366-kube-api-access-zvxqx\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.755257 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-combined-ca-bundle\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.755384 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-config-data\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.857213 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-combined-ca-bundle\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.857279 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-config-data\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.857386 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvxqx\" (UniqueName: \"kubernetes.io/projected/0b0a935b-06d8-47e6-856f-d7f9b048d366-kube-api-access-zvxqx\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.864236 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-combined-ca-bundle\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.872973 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvxqx\" (UniqueName: \"kubernetes.io/projected/0b0a935b-06d8-47e6-856f-d7f9b048d366-kube-api-access-zvxqx\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:47 crc kubenswrapper[4732]: I1010 08:29:47.873146 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-config-data\") pod \"heat-db-sync-9c2bk\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:48 crc kubenswrapper[4732]: I1010 08:29:48.025219 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-9c2bk" Oct 10 08:29:48 crc kubenswrapper[4732]: I1010 08:29:48.594160 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-9c2bk"] Oct 10 08:29:49 crc kubenswrapper[4732]: I1010 08:29:49.140410 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9c2bk" event={"ID":"0b0a935b-06d8-47e6-856f-d7f9b048d366","Type":"ContainerStarted","Data":"5bd22e8e80f86d485104659d694d3618cd6c23f61d8d3498cf6d14cef18c2309"} Oct 10 08:29:53 crc kubenswrapper[4732]: I1010 08:29:53.029085 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qgnk9"] Oct 10 08:29:53 crc kubenswrapper[4732]: I1010 08:29:53.042777 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qgnk9"] Oct 10 08:29:53 crc kubenswrapper[4732]: I1010 08:29:53.689814 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b458ddfb-e4b7-4348-bf92-57cfc0a37076" path="/var/lib/kubelet/pods/b458ddfb-e4b7-4348-bf92-57cfc0a37076/volumes" Oct 10 08:29:53 crc kubenswrapper[4732]: I1010 08:29:53.690554 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:55 crc kubenswrapper[4732]: I1010 08:29:55.412181 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-567cbfd676-jtwjz" Oct 10 08:29:55 crc kubenswrapper[4732]: I1010 08:29:55.491138 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cbd6844cb-rnrwt"] Oct 10 08:29:55 crc kubenswrapper[4732]: I1010 08:29:55.491365 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cbd6844cb-rnrwt" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon-log" containerID="cri-o://a39a3119019264e0284f247dfa899bd69435ff229af03952c369ca4a788c539e" gracePeriod=30 Oct 10 08:29:55 crc kubenswrapper[4732]: 
I1010 08:29:55.491820 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cbd6844cb-rnrwt" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" containerID="cri-o://52ea17f600f69a171aaaaa793ba0984c355be42198e0ebdf848e2a5cefd69ab6" gracePeriod=30 Oct 10 08:29:57 crc kubenswrapper[4732]: I1010 08:29:57.215071 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9c2bk" event={"ID":"0b0a935b-06d8-47e6-856f-d7f9b048d366","Type":"ContainerStarted","Data":"cf19aa260e8267381679e3212c8ba4b531c4ed7c945c53b796baef22cdff1558"} Oct 10 08:29:57 crc kubenswrapper[4732]: I1010 08:29:57.239794 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-9c2bk" podStartSLOduration=2.827918323 podStartE2EDuration="10.239776516s" podCreationTimestamp="2025-10-10 08:29:47 +0000 UTC" firstStartedPulling="2025-10-10 08:29:48.600589747 +0000 UTC m=+5915.670180998" lastFinishedPulling="2025-10-10 08:29:56.01244793 +0000 UTC m=+5923.082039191" observedRunningTime="2025-10-10 08:29:57.232653565 +0000 UTC m=+5924.302244806" watchObservedRunningTime="2025-10-10 08:29:57.239776516 +0000 UTC m=+5924.309367757" Oct 10 08:29:58 crc kubenswrapper[4732]: I1010 08:29:58.656737 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cbd6844cb-rnrwt" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.101:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:40704->10.217.1.101:8443: read: connection reset by peer" Oct 10 08:29:59 crc kubenswrapper[4732]: I1010 08:29:59.243437 4732 generic.go:334] "Generic (PLEG): container finished" podID="0b0a935b-06d8-47e6-856f-d7f9b048d366" containerID="cf19aa260e8267381679e3212c8ba4b531c4ed7c945c53b796baef22cdff1558" exitCode=0 Oct 10 08:29:59 crc kubenswrapper[4732]: I1010 08:29:59.243533 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/heat-db-sync-9c2bk" event={"ID":"0b0a935b-06d8-47e6-856f-d7f9b048d366","Type":"ContainerDied","Data":"cf19aa260e8267381679e3212c8ba4b531c4ed7c945c53b796baef22cdff1558"} Oct 10 08:29:59 crc kubenswrapper[4732]: I1010 08:29:59.246541 4732 generic.go:334] "Generic (PLEG): container finished" podID="f934d435-c311-497b-9298-43a3f48e717f" containerID="52ea17f600f69a171aaaaa793ba0984c355be42198e0ebdf848e2a5cefd69ab6" exitCode=0 Oct 10 08:29:59 crc kubenswrapper[4732]: I1010 08:29:59.246580 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbd6844cb-rnrwt" event={"ID":"f934d435-c311-497b-9298-43a3f48e717f","Type":"ContainerDied","Data":"52ea17f600f69a171aaaaa793ba0984c355be42198e0ebdf848e2a5cefd69ab6"} Oct 10 08:29:59 crc kubenswrapper[4732]: I1010 08:29:59.661102 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.203099 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z"] Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.206783 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.209381 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.209565 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.219246 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxtnf\" (UniqueName: \"kubernetes.io/projected/77b1c333-44ac-4cb6-bac5-598107d56e7b-kube-api-access-nxtnf\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.219431 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b1c333-44ac-4cb6-bac5-598107d56e7b-config-volume\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.219528 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b1c333-44ac-4cb6-bac5-598107d56e7b-secret-volume\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.225973 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z"] Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.258614 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"bf642036695cfd106f68d43cb38c2eaa4d9337092959ec0ecee4949724bd8c41"} Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.321379 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxtnf\" (UniqueName: \"kubernetes.io/projected/77b1c333-44ac-4cb6-bac5-598107d56e7b-kube-api-access-nxtnf\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.321463 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b1c333-44ac-4cb6-bac5-598107d56e7b-config-volume\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.321493 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b1c333-44ac-4cb6-bac5-598107d56e7b-secret-volume\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.323133 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b1c333-44ac-4cb6-bac5-598107d56e7b-config-volume\") pod \"collect-profiles-29334750-9hz9z\" (UID: 
\"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.335374 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b1c333-44ac-4cb6-bac5-598107d56e7b-secret-volume\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.339424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxtnf\" (UniqueName: \"kubernetes.io/projected/77b1c333-44ac-4cb6-bac5-598107d56e7b-kube-api-access-nxtnf\") pod \"collect-profiles-29334750-9hz9z\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.534342 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.638805 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-9c2bk" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.835547 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-config-data\") pod \"0b0a935b-06d8-47e6-856f-d7f9b048d366\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.836005 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-combined-ca-bundle\") pod \"0b0a935b-06d8-47e6-856f-d7f9b048d366\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.836122 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvxqx\" (UniqueName: \"kubernetes.io/projected/0b0a935b-06d8-47e6-856f-d7f9b048d366-kube-api-access-zvxqx\") pod \"0b0a935b-06d8-47e6-856f-d7f9b048d366\" (UID: \"0b0a935b-06d8-47e6-856f-d7f9b048d366\") " Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.843312 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b0a935b-06d8-47e6-856f-d7f9b048d366-kube-api-access-zvxqx" (OuterVolumeSpecName: "kube-api-access-zvxqx") pod "0b0a935b-06d8-47e6-856f-d7f9b048d366" (UID: "0b0a935b-06d8-47e6-856f-d7f9b048d366"). InnerVolumeSpecName "kube-api-access-zvxqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.869267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b0a935b-06d8-47e6-856f-d7f9b048d366" (UID: "0b0a935b-06d8-47e6-856f-d7f9b048d366"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.907672 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-config-data" (OuterVolumeSpecName: "config-data") pod "0b0a935b-06d8-47e6-856f-d7f9b048d366" (UID: "0b0a935b-06d8-47e6-856f-d7f9b048d366"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.938680 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.938735 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b0a935b-06d8-47e6-856f-d7f9b048d366-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.938753 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvxqx\" (UniqueName: \"kubernetes.io/projected/0b0a935b-06d8-47e6-856f-d7f9b048d366-kube-api-access-zvxqx\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:00 crc kubenswrapper[4732]: I1010 08:30:00.985805 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z"] Oct 10 08:30:01 crc kubenswrapper[4732]: I1010 08:30:01.268543 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-9c2bk" event={"ID":"0b0a935b-06d8-47e6-856f-d7f9b048d366","Type":"ContainerDied","Data":"5bd22e8e80f86d485104659d694d3618cd6c23f61d8d3498cf6d14cef18c2309"} Oct 10 08:30:01 crc kubenswrapper[4732]: I1010 08:30:01.268909 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5bd22e8e80f86d485104659d694d3618cd6c23f61d8d3498cf6d14cef18c2309" Oct 10 08:30:01 crc kubenswrapper[4732]: I1010 08:30:01.268555 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-9c2bk" Oct 10 08:30:01 crc kubenswrapper[4732]: I1010 08:30:01.270225 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" event={"ID":"77b1c333-44ac-4cb6-bac5-598107d56e7b","Type":"ContainerStarted","Data":"d2e37ba17e770666d7f7da5a1690c35585862e459b2fb04341396494a59302ee"} Oct 10 08:30:01 crc kubenswrapper[4732]: I1010 08:30:01.270254 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" event={"ID":"77b1c333-44ac-4cb6-bac5-598107d56e7b","Type":"ContainerStarted","Data":"5ca23acdeac53172b7d2eac5f2fb2a4e5f378d637ae08fe231e97b724086d23a"} Oct 10 08:30:01 crc kubenswrapper[4732]: I1010 08:30:01.309879 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" podStartSLOduration=1.309838892 podStartE2EDuration="1.309838892s" podCreationTimestamp="2025-10-10 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:30:01.291792149 +0000 UTC m=+5928.361383410" watchObservedRunningTime="2025-10-10 08:30:01.309838892 +0000 UTC m=+5928.379430123" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.250125 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-d6b8bdf8b-bkd26"] Oct 10 08:30:02 crc kubenswrapper[4732]: E1010 08:30:02.250843 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b0a935b-06d8-47e6-856f-d7f9b048d366" containerName="heat-db-sync" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.250857 4732 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0b0a935b-06d8-47e6-856f-d7f9b048d366" containerName="heat-db-sync" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.251030 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b0a935b-06d8-47e6-856f-d7f9b048d366" containerName="heat-db-sync" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.251677 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.258319 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.265730 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2kcz6" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.265915 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.281802 4732 generic.go:334] "Generic (PLEG): container finished" podID="77b1c333-44ac-4cb6-bac5-598107d56e7b" containerID="d2e37ba17e770666d7f7da5a1690c35585862e459b2fb04341396494a59302ee" exitCode=0 Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.281843 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" event={"ID":"77b1c333-44ac-4cb6-bac5-598107d56e7b","Type":"ContainerDied","Data":"d2e37ba17e770666d7f7da5a1690c35585862e459b2fb04341396494a59302ee"} Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.298921 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-d6b8bdf8b-bkd26"] Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.350820 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-799877d5cf-8dwqx"] Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.351921 4732 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.357113 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.363484 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2ftp\" (UniqueName: \"kubernetes.io/projected/922f5d41-7b29-4466-a199-00ac9ff5a424-kube-api-access-j2ftp\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.363529 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.363571 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-combined-ca-bundle\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.363681 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data-custom\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.387789 4732 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/heat-api-799877d5cf-8dwqx"] Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.424208 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6d7cd5f486-p94xv"] Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.426344 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.431276 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.437512 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6d7cd5f486-p94xv"] Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.473062 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6ddc\" (UniqueName: \"kubernetes.io/projected/25da90c2-6c96-45d4-9a99-ae6ba7b76927-kube-api-access-q6ddc\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.473152 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.473420 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-combined-ca-bundle\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 
08:30:02.473486 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data-custom\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.473534 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data-custom\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.473557 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2ftp\" (UniqueName: \"kubernetes.io/projected/922f5d41-7b29-4466-a199-00ac9ff5a424-kube-api-access-j2ftp\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.473581 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.473613 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-combined-ca-bundle\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.480382 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-combined-ca-bundle\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.482190 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.484463 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data-custom\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.492207 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2ftp\" (UniqueName: \"kubernetes.io/projected/922f5d41-7b29-4466-a199-00ac9ff5a424-kube-api-access-j2ftp\") pod \"heat-engine-d6b8bdf8b-bkd26\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.575458 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587190 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-combined-ca-bundle\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587305 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data-custom\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587380 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data-custom\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587490 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587526 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6ddc\" (UniqueName: \"kubernetes.io/projected/25da90c2-6c96-45d4-9a99-ae6ba7b76927-kube-api-access-q6ddc\") pod \"heat-api-799877d5cf-8dwqx\" (UID: 
\"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587574 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-combined-ca-bundle\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587623 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cfj4\" (UniqueName: \"kubernetes.io/projected/a25c6360-5e71-407f-8d5b-100bdcfd71e3-kube-api-access-7cfj4\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.587676 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.600518 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data-custom\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.613143 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-combined-ca-bundle\") pod \"heat-api-799877d5cf-8dwqx\" (UID: 
\"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.617070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.631796 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6ddc\" (UniqueName: \"kubernetes.io/projected/25da90c2-6c96-45d4-9a99-ae6ba7b76927-kube-api-access-q6ddc\") pod \"heat-api-799877d5cf-8dwqx\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.673401 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.691796 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-combined-ca-bundle\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.691908 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data-custom\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.692098 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cfj4\" (UniqueName: 
\"kubernetes.io/projected/a25c6360-5e71-407f-8d5b-100bdcfd71e3-kube-api-access-7cfj4\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.692141 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.697142 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-combined-ca-bundle\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.706823 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data-custom\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.708646 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.730201 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cfj4\" (UniqueName: 
\"kubernetes.io/projected/a25c6360-5e71-407f-8d5b-100bdcfd71e3-kube-api-access-7cfj4\") pod \"heat-cfnapi-6d7cd5f486-p94xv\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:02 crc kubenswrapper[4732]: I1010 08:30:02.755096 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.091527 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-d6b8bdf8b-bkd26"] Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.248071 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6d7cd5f486-p94xv"] Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.264719 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-799877d5cf-8dwqx"] Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.301494 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" event={"ID":"a25c6360-5e71-407f-8d5b-100bdcfd71e3","Type":"ContainerStarted","Data":"eb620c3d3085aa65a8a695f27f762217a310a8a5a62886cb7cf6258ef978c238"} Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.306031 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d6b8bdf8b-bkd26" event={"ID":"922f5d41-7b29-4466-a199-00ac9ff5a424","Type":"ContainerStarted","Data":"e5ca578ca47b4cf1daf5ba86389511463716ca4eff6493f76052f5d12acef155"} Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.731726 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.913219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b1c333-44ac-4cb6-bac5-598107d56e7b-config-volume\") pod \"77b1c333-44ac-4cb6-bac5-598107d56e7b\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.914529 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77b1c333-44ac-4cb6-bac5-598107d56e7b-config-volume" (OuterVolumeSpecName: "config-volume") pod "77b1c333-44ac-4cb6-bac5-598107d56e7b" (UID: "77b1c333-44ac-4cb6-bac5-598107d56e7b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.915536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxtnf\" (UniqueName: \"kubernetes.io/projected/77b1c333-44ac-4cb6-bac5-598107d56e7b-kube-api-access-nxtnf\") pod \"77b1c333-44ac-4cb6-bac5-598107d56e7b\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.915592 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b1c333-44ac-4cb6-bac5-598107d56e7b-secret-volume\") pod \"77b1c333-44ac-4cb6-bac5-598107d56e7b\" (UID: \"77b1c333-44ac-4cb6-bac5-598107d56e7b\") " Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.916307 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77b1c333-44ac-4cb6-bac5-598107d56e7b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.920924 4732 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/77b1c333-44ac-4cb6-bac5-598107d56e7b-kube-api-access-nxtnf" (OuterVolumeSpecName: "kube-api-access-nxtnf") pod "77b1c333-44ac-4cb6-bac5-598107d56e7b" (UID: "77b1c333-44ac-4cb6-bac5-598107d56e7b"). InnerVolumeSpecName "kube-api-access-nxtnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:03 crc kubenswrapper[4732]: I1010 08:30:03.934806 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77b1c333-44ac-4cb6-bac5-598107d56e7b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77b1c333-44ac-4cb6-bac5-598107d56e7b" (UID: "77b1c333-44ac-4cb6-bac5-598107d56e7b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.018361 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxtnf\" (UniqueName: \"kubernetes.io/projected/77b1c333-44ac-4cb6-bac5-598107d56e7b-kube-api-access-nxtnf\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.018406 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77b1c333-44ac-4cb6-bac5-598107d56e7b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.329398 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d6b8bdf8b-bkd26" event={"ID":"922f5d41-7b29-4466-a199-00ac9ff5a424","Type":"ContainerStarted","Data":"df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2"} Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.329861 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.340879 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.342155 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z" event={"ID":"77b1c333-44ac-4cb6-bac5-598107d56e7b","Type":"ContainerDied","Data":"5ca23acdeac53172b7d2eac5f2fb2a4e5f378d637ae08fe231e97b724086d23a"} Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.342197 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca23acdeac53172b7d2eac5f2fb2a4e5f378d637ae08fe231e97b724086d23a" Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.357546 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-799877d5cf-8dwqx" event={"ID":"25da90c2-6c96-45d4-9a99-ae6ba7b76927","Type":"ContainerStarted","Data":"d6b9a3f0acde60e4b1b9162f36b92a2016550fabb709f6e074feb1d6a8b0f5cf"} Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.364927 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-d6b8bdf8b-bkd26" podStartSLOduration=2.36490375 podStartE2EDuration="2.36490375s" podCreationTimestamp="2025-10-10 08:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:30:04.355346344 +0000 UTC m=+5931.424937595" watchObservedRunningTime="2025-10-10 08:30:04.36490375 +0000 UTC m=+5931.434494991" Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.377843 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4"] Oct 10 08:30:04 crc kubenswrapper[4732]: I1010 08:30:04.388599 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334705-6vwz4"] Oct 10 08:30:05 crc kubenswrapper[4732]: I1010 08:30:05.671611 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b467f54-105b-43d2-ac29-0d3e6cfd993d" path="/var/lib/kubelet/pods/4b467f54-105b-43d2-ac29-0d3e6cfd993d/volumes" Oct 10 08:30:06 crc kubenswrapper[4732]: I1010 08:30:06.299790 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cbd6844cb-rnrwt" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.101:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.101:8443: connect: connection refused" Oct 10 08:30:06 crc kubenswrapper[4732]: I1010 08:30:06.379497 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-799877d5cf-8dwqx" event={"ID":"25da90c2-6c96-45d4-9a99-ae6ba7b76927","Type":"ContainerStarted","Data":"18053e77da6d524166a649a2782a3d37461020a743b795ff69e979c47bc3bb93"} Oct 10 08:30:06 crc kubenswrapper[4732]: I1010 08:30:06.380398 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:06 crc kubenswrapper[4732]: I1010 08:30:06.382307 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" event={"ID":"a25c6360-5e71-407f-8d5b-100bdcfd71e3","Type":"ContainerStarted","Data":"4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f"} Oct 10 08:30:06 crc kubenswrapper[4732]: I1010 08:30:06.383207 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:06 crc kubenswrapper[4732]: I1010 08:30:06.410371 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-799877d5cf-8dwqx" podStartSLOduration=2.459432381 podStartE2EDuration="4.410322701s" podCreationTimestamp="2025-10-10 08:30:02 +0000 UTC" firstStartedPulling="2025-10-10 08:30:03.292701948 +0000 UTC m=+5930.362293189" lastFinishedPulling="2025-10-10 08:30:05.243592268 +0000 UTC 
m=+5932.313183509" observedRunningTime="2025-10-10 08:30:06.400116868 +0000 UTC m=+5933.469708109" watchObservedRunningTime="2025-10-10 08:30:06.410322701 +0000 UTC m=+5933.479913952" Oct 10 08:30:06 crc kubenswrapper[4732]: I1010 08:30:06.419622 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" podStartSLOduration=2.454174421 podStartE2EDuration="4.41960235s" podCreationTimestamp="2025-10-10 08:30:02 +0000 UTC" firstStartedPulling="2025-10-10 08:30:03.279578457 +0000 UTC m=+5930.349169698" lastFinishedPulling="2025-10-10 08:30:05.245006386 +0000 UTC m=+5932.314597627" observedRunningTime="2025-10-10 08:30:06.418178491 +0000 UTC m=+5933.487769732" watchObservedRunningTime="2025-10-10 08:30:06.41960235 +0000 UTC m=+5933.489193591" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.431188 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5844bd7ddb-sds75"] Oct 10 08:30:09 crc kubenswrapper[4732]: E1010 08:30:09.432018 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77b1c333-44ac-4cb6-bac5-598107d56e7b" containerName="collect-profiles" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.432038 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="77b1c333-44ac-4cb6-bac5-598107d56e7b" containerName="collect-profiles" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.432302 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="77b1c333-44ac-4cb6-bac5-598107d56e7b" containerName="collect-profiles" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.433939 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.445542 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5844bd7ddb-sds75"] Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.459990 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-cc954bc69-kbqfk"] Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.461589 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.472390 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-56b56778f9-lhkw6"] Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.474080 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.483920 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-cc954bc69-kbqfk"] Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.516236 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-56b56778f9-lhkw6"] Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.546080 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.546192 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-combined-ca-bundle\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: 
\"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.546227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-combined-ca-bundle\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.546257 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqtq\" (UniqueName: \"kubernetes.io/projected/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-kube-api-access-ctqtq\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.546366 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-combined-ca-bundle\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.546445 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-config-data\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.546722 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data-custom\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.547679 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59k54\" (UniqueName: \"kubernetes.io/projected/96365079-5a8a-4b2b-87f0-502a7b09ed3c-kube-api-access-59k54\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.547786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgbwt\" (UniqueName: \"kubernetes.io/projected/e774fac5-92b3-4c5b-be38-0499157c9194-kube-api-access-jgbwt\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.547828 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.547973 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data-custom\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.548306 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-config-data-custom\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649300 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data-custom\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649688 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-config-data-custom\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649802 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649851 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-combined-ca-bundle\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649874 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-combined-ca-bundle\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649899 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqtq\" (UniqueName: \"kubernetes.io/projected/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-kube-api-access-ctqtq\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-combined-ca-bundle\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.649958 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-config-data\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.650002 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data-custom\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.650035 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59k54\" (UniqueName: 
\"kubernetes.io/projected/96365079-5a8a-4b2b-87f0-502a7b09ed3c-kube-api-access-59k54\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.650064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgbwt\" (UniqueName: \"kubernetes.io/projected/e774fac5-92b3-4c5b-be38-0499157c9194-kube-api-access-jgbwt\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.650087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.658188 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data-custom\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.666476 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-config-data-custom\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.666719 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-combined-ca-bundle\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.672121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgbwt\" (UniqueName: \"kubernetes.io/projected/e774fac5-92b3-4c5b-be38-0499157c9194-kube-api-access-jgbwt\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.673179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.673370 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-config-data\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.673421 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96365079-5a8a-4b2b-87f0-502a7b09ed3c-combined-ca-bundle\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.675551 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data\") pod 
\"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.675681 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59k54\" (UniqueName: \"kubernetes.io/projected/96365079-5a8a-4b2b-87f0-502a7b09ed3c-kube-api-access-59k54\") pod \"heat-engine-5844bd7ddb-sds75\" (UID: \"96365079-5a8a-4b2b-87f0-502a7b09ed3c\") " pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.676576 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-combined-ca-bundle\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.676860 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqtq\" (UniqueName: \"kubernetes.io/projected/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-kube-api-access-ctqtq\") pod \"heat-api-56b56778f9-lhkw6\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.677745 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data-custom\") pod \"heat-cfnapi-cc954bc69-kbqfk\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.755130 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.790058 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:09 crc kubenswrapper[4732]: I1010 08:30:09.823522 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.254126 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5844bd7ddb-sds75"] Oct 10 08:30:11 crc kubenswrapper[4732]: W1010 08:30:10.256478 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96365079_5a8a_4b2b_87f0_502a7b09ed3c.slice/crio-d554866898a64624f81426d365520d0f7dbe42c64cc5c26046f3764130e517ca WatchSource:0}: Error finding container d554866898a64624f81426d365520d0f7dbe42c64cc5c26046f3764130e517ca: Status 404 returned error can't find the container with id d554866898a64624f81426d365520d0f7dbe42c64cc5c26046f3764130e517ca Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.364299 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-cc954bc69-kbqfk"] Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.373391 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-56b56778f9-lhkw6"] Oct 10 08:30:11 crc kubenswrapper[4732]: W1010 08:30:10.403212 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode774fac5_92b3_4c5b_be38_0499157c9194.slice/crio-b63ac20417728eb29d8f5bd785b0613ce172467b0157cb6dbc9ed77ba86152af WatchSource:0}: Error finding container b63ac20417728eb29d8f5bd785b0613ce172467b0157cb6dbc9ed77ba86152af: Status 404 returned error can't find the container with id b63ac20417728eb29d8f5bd785b0613ce172467b0157cb6dbc9ed77ba86152af Oct 10 08:30:11 crc kubenswrapper[4732]: W1010 08:30:10.403666 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1f6e5e_2a3c_44f6_a027_76a73c54085a.slice/crio-50585a3d2d9211e04c9b5c39466460996d1d7260529cf1aa18ea83ab3b692202 WatchSource:0}: Error finding container 50585a3d2d9211e04c9b5c39466460996d1d7260529cf1aa18ea83ab3b692202: Status 404 returned error can't find the container with id 50585a3d2d9211e04c9b5c39466460996d1d7260529cf1aa18ea83ab3b692202 Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.428803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" event={"ID":"e774fac5-92b3-4c5b-be38-0499157c9194","Type":"ContainerStarted","Data":"b63ac20417728eb29d8f5bd785b0613ce172467b0157cb6dbc9ed77ba86152af"} Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.430395 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56b56778f9-lhkw6" event={"ID":"1d1f6e5e-2a3c-44f6-a027-76a73c54085a","Type":"ContainerStarted","Data":"50585a3d2d9211e04c9b5c39466460996d1d7260529cf1aa18ea83ab3b692202"} Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.435389 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5844bd7ddb-sds75" event={"ID":"96365079-5a8a-4b2b-87f0-502a7b09ed3c","Type":"ContainerStarted","Data":"d554866898a64624f81426d365520d0f7dbe42c64cc5c26046f3764130e517ca"} Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.668205 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-799877d5cf-8dwqx"] Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.669109 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-799877d5cf-8dwqx" podUID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" containerName="heat-api" containerID="cri-o://18053e77da6d524166a649a2782a3d37461020a743b795ff69e979c47bc3bb93" gracePeriod=60 Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.689045 4732 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/heat-api-799877d5cf-8dwqx" podUID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.111:8004/healthcheck\": EOF" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.725250 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6d7cd5f486-p94xv"] Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.725447 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" podUID="a25c6360-5e71-407f-8d5b-100bdcfd71e3" containerName="heat-cfnapi" containerID="cri-o://4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f" gracePeriod=60 Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.744266 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-575d45d5d7-xbm4f"] Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.745486 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.756163 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.757542 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.768043 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-575d45d5d7-xbm4f"] Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.779791 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7dbb656958-mt7c9"] Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.781204 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.787221 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.787249 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.789558 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dbb656958-mt7c9"] Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871032 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-config-data\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-config-data-custom\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871222 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-public-tls-certs\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871300 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-config-data-custom\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-combined-ca-bundle\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871450 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-internal-tls-certs\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871565 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-config-data\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871627 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-combined-ca-bundle\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871666 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ndtf8\" (UniqueName: \"kubernetes.io/projected/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-kube-api-access-ndtf8\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871788 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-public-tls-certs\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871810 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s9fm\" (UniqueName: \"kubernetes.io/projected/ad059a3e-2246-4d06-bc00-e030379d69d5-kube-api-access-2s9fm\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.871832 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-internal-tls-certs\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973374 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-config-data-custom\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973446 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-public-tls-certs\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973491 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-config-data-custom\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-combined-ca-bundle\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973572 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-internal-tls-certs\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973647 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-config-data\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973680 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-combined-ca-bundle\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973733 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndtf8\" (UniqueName: \"kubernetes.io/projected/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-kube-api-access-ndtf8\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-public-tls-certs\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s9fm\" (UniqueName: \"kubernetes.io/projected/ad059a3e-2246-4d06-bc00-e030379d69d5-kube-api-access-2s9fm\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973842 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-internal-tls-certs\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.973884 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-config-data\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.979355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-config-data-custom\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.980364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-config-data-custom\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.984244 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-public-tls-certs\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.988094 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-config-data\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.990231 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-internal-tls-certs\") pod 
\"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.990894 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-combined-ca-bundle\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.991325 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-combined-ca-bundle\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.992279 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-public-tls-certs\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.993191 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-internal-tls-certs\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:10.994902 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s9fm\" (UniqueName: \"kubernetes.io/projected/ad059a3e-2246-4d06-bc00-e030379d69d5-kube-api-access-2s9fm\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " 
pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.002175 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad059a3e-2246-4d06-bc00-e030379d69d5-config-data\") pod \"heat-api-575d45d5d7-xbm4f\" (UID: \"ad059a3e-2246-4d06-bc00-e030379d69d5\") " pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.013868 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndtf8\" (UniqueName: \"kubernetes.io/projected/3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652-kube-api-access-ndtf8\") pod \"heat-cfnapi-7dbb656958-mt7c9\" (UID: \"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652\") " pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.098026 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.125961 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.448004 4732 generic.go:334] "Generic (PLEG): container finished" podID="e774fac5-92b3-4c5b-be38-0499157c9194" containerID="6856342d58eeb4a3a83e33da745d14fd7c1c904316481601e1edc45e8ac844c8" exitCode=1 Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.448342 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" event={"ID":"e774fac5-92b3-4c5b-be38-0499157c9194","Type":"ContainerDied","Data":"6856342d58eeb4a3a83e33da745d14fd7c1c904316481601e1edc45e8ac844c8"} Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.449185 4732 scope.go:117] "RemoveContainer" containerID="6856342d58eeb4a3a83e33da745d14fd7c1c904316481601e1edc45e8ac844c8" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.453547 4732 generic.go:334] "Generic (PLEG): container finished" podID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerID="c92277d22c48d33448e772cf4aa765432a37ef07a0053bef5b42124a3147686a" exitCode=1 Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.453603 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56b56778f9-lhkw6" event={"ID":"1d1f6e5e-2a3c-44f6-a027-76a73c54085a","Type":"ContainerDied","Data":"c92277d22c48d33448e772cf4aa765432a37ef07a0053bef5b42124a3147686a"} Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.454256 4732 scope.go:117] "RemoveContainer" containerID="c92277d22c48d33448e772cf4aa765432a37ef07a0053bef5b42124a3147686a" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.458882 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5844bd7ddb-sds75" event={"ID":"96365079-5a8a-4b2b-87f0-502a7b09ed3c","Type":"ContainerStarted","Data":"e9ddb90e060f02d62a648b61285f054cb5c271d709957b65e411fe94a5bd431b"} Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.459309 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.510196 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5844bd7ddb-sds75" podStartSLOduration=2.510170394 podStartE2EDuration="2.510170394s" podCreationTimestamp="2025-10-10 08:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:30:11.491369861 +0000 UTC m=+5938.560961102" watchObservedRunningTime="2025-10-10 08:30:11.510170394 +0000 UTC m=+5938.579761645" Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.817158 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7dbb656958-mt7c9"] Oct 10 08:30:11 crc kubenswrapper[4732]: W1010 08:30:11.821332 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ba0bd8d_c5b5_4ac5_8abf_260d7f7dc652.slice/crio-cfcf92b40bca402b609b34f83dc4cdee85e5869537234dcbd46ad95a78e698f0 WatchSource:0}: Error finding container cfcf92b40bca402b609b34f83dc4cdee85e5869537234dcbd46ad95a78e698f0: Status 404 returned error can't find the container with id cfcf92b40bca402b609b34f83dc4cdee85e5869537234dcbd46ad95a78e698f0 Oct 10 08:30:11 crc kubenswrapper[4732]: I1010 08:30:11.864795 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-575d45d5d7-xbm4f"] Oct 10 08:30:11 crc kubenswrapper[4732]: W1010 08:30:11.871822 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad059a3e_2246_4d06_bc00_e030379d69d5.slice/crio-8a61ffaad20a43f2edbe06f0301fbb527c933fc0122902ab653269c4d98c234e WatchSource:0}: Error finding container 8a61ffaad20a43f2edbe06f0301fbb527c933fc0122902ab653269c4d98c234e: Status 404 returned error can't find the container with id 
8a61ffaad20a43f2edbe06f0301fbb527c933fc0122902ab653269c4d98c234e Oct 10 08:30:12 crc kubenswrapper[4732]: E1010 08:30:12.217039 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1f6e5e_2a3c_44f6_a027_76a73c54085a.slice/crio-conmon-e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode774fac5_92b3_4c5b_be38_0499157c9194.slice/crio-227f5ae4e09967ec9413a26749c523cbdfbfe78430dde1aa60d3a542a8ea2db0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1f6e5e_2a3c_44f6_a027_76a73c54085a.slice/crio-e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.430621 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.482120 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dbb656958-mt7c9" event={"ID":"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652","Type":"ContainerStarted","Data":"4053fbb4b4869b20fa7567c89d3f8f9bddf41162825a32896d59be21e4b8eb64"} Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.482429 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7dbb656958-mt7c9" event={"ID":"3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652","Type":"ContainerStarted","Data":"cfcf92b40bca402b609b34f83dc4cdee85e5869537234dcbd46ad95a78e698f0"} Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.482537 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:12 crc kubenswrapper[4732]: 
I1010 08:30:12.484889 4732 generic.go:334] "Generic (PLEG): container finished" podID="e774fac5-92b3-4c5b-be38-0499157c9194" containerID="227f5ae4e09967ec9413a26749c523cbdfbfe78430dde1aa60d3a542a8ea2db0" exitCode=1 Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.484952 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" event={"ID":"e774fac5-92b3-4c5b-be38-0499157c9194","Type":"ContainerDied","Data":"227f5ae4e09967ec9413a26749c523cbdfbfe78430dde1aa60d3a542a8ea2db0"} Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.485000 4732 scope.go:117] "RemoveContainer" containerID="6856342d58eeb4a3a83e33da745d14fd7c1c904316481601e1edc45e8ac844c8" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.485600 4732 scope.go:117] "RemoveContainer" containerID="227f5ae4e09967ec9413a26749c523cbdfbfe78430dde1aa60d3a542a8ea2db0" Oct 10 08:30:12 crc kubenswrapper[4732]: E1010 08:30:12.485893 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-cc954bc69-kbqfk_openstack(e774fac5-92b3-4c5b-be38-0499157c9194)\"" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.489371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-575d45d5d7-xbm4f" event={"ID":"ad059a3e-2246-4d06-bc00-e030379d69d5","Type":"ContainerStarted","Data":"5e3627fa8cb494f5e0d24a4fee9845af2866fb88691eb647c7db0deae8743220"} Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.489413 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-575d45d5d7-xbm4f" event={"ID":"ad059a3e-2246-4d06-bc00-e030379d69d5","Type":"ContainerStarted","Data":"8a61ffaad20a43f2edbe06f0301fbb527c933fc0122902ab653269c4d98c234e"} Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.489546 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.495060 4732 generic.go:334] "Generic (PLEG): container finished" podID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerID="e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2" exitCode=1 Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.495114 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56b56778f9-lhkw6" event={"ID":"1d1f6e5e-2a3c-44f6-a027-76a73c54085a","Type":"ContainerDied","Data":"e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2"} Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.496182 4732 scope.go:117] "RemoveContainer" containerID="e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2" Oct 10 08:30:12 crc kubenswrapper[4732]: E1010 08:30:12.496499 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56b56778f9-lhkw6_openstack(1d1f6e5e-2a3c-44f6-a027-76a73c54085a)\"" pod="openstack/heat-api-56b56778f9-lhkw6" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.500975 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7dbb656958-mt7c9" podStartSLOduration=2.500955304 podStartE2EDuration="2.500955304s" podCreationTimestamp="2025-10-10 08:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:30:12.497496902 +0000 UTC m=+5939.567088143" watchObservedRunningTime="2025-10-10 08:30:12.500955304 +0000 UTC m=+5939.570546535" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.542242 4732 scope.go:117] "RemoveContainer" 
containerID="c92277d22c48d33448e772cf4aa765432a37ef07a0053bef5b42124a3147686a" Oct 10 08:30:12 crc kubenswrapper[4732]: I1010 08:30:12.571652 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-575d45d5d7-xbm4f" podStartSLOduration=2.571630927 podStartE2EDuration="2.571630927s" podCreationTimestamp="2025-10-10 08:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:30:12.562146573 +0000 UTC m=+5939.631737814" watchObservedRunningTime="2025-10-10 08:30:12.571630927 +0000 UTC m=+5939.641222168" Oct 10 08:30:13 crc kubenswrapper[4732]: I1010 08:30:13.504590 4732 scope.go:117] "RemoveContainer" containerID="227f5ae4e09967ec9413a26749c523cbdfbfe78430dde1aa60d3a542a8ea2db0" Oct 10 08:30:13 crc kubenswrapper[4732]: E1010 08:30:13.504916 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-cc954bc69-kbqfk_openstack(e774fac5-92b3-4c5b-be38-0499157c9194)\"" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" Oct 10 08:30:13 crc kubenswrapper[4732]: I1010 08:30:13.507717 4732 scope.go:117] "RemoveContainer" containerID="e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2" Oct 10 08:30:13 crc kubenswrapper[4732]: E1010 08:30:13.507887 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56b56778f9-lhkw6_openstack(1d1f6e5e-2a3c-44f6-a027-76a73c54085a)\"" pod="openstack/heat-api-56b56778f9-lhkw6" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" Oct 10 08:30:14 crc kubenswrapper[4732]: I1010 08:30:14.790865 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:14 crc kubenswrapper[4732]: I1010 08:30:14.791168 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:14 crc kubenswrapper[4732]: I1010 08:30:14.791876 4732 scope.go:117] "RemoveContainer" containerID="227f5ae4e09967ec9413a26749c523cbdfbfe78430dde1aa60d3a542a8ea2db0" Oct 10 08:30:14 crc kubenswrapper[4732]: E1010 08:30:14.792084 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-cc954bc69-kbqfk_openstack(e774fac5-92b3-4c5b-be38-0499157c9194)\"" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" Oct 10 08:30:14 crc kubenswrapper[4732]: I1010 08:30:14.825259 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:14 crc kubenswrapper[4732]: I1010 08:30:14.825314 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:14 crc kubenswrapper[4732]: I1010 08:30:14.826061 4732 scope.go:117] "RemoveContainer" containerID="e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2" Oct 10 08:30:14 crc kubenswrapper[4732]: E1010 08:30:14.826303 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-56b56778f9-lhkw6_openstack(1d1f6e5e-2a3c-44f6-a027-76a73c54085a)\"" pod="openstack/heat-api-56b56778f9-lhkw6" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.115339 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" podUID="a25c6360-5e71-407f-8d5b-100bdcfd71e3" 
containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.112:8000/healthcheck\": read tcp 10.217.0.2:53734->10.217.1.112:8000: read: connection reset by peer" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.121462 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-799877d5cf-8dwqx" podUID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.111:8004/healthcheck\": read tcp 10.217.0.2:53024->10.217.1.111:8004: read: connection reset by peer" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.300240 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5cbd6844cb-rnrwt" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.101:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.101:8443: connect: connection refused" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.300367 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.507212 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.543203 4732 generic.go:334] "Generic (PLEG): container finished" podID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" containerID="18053e77da6d524166a649a2782a3d37461020a743b795ff69e979c47bc3bb93" exitCode=0 Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.543277 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-799877d5cf-8dwqx" event={"ID":"25da90c2-6c96-45d4-9a99-ae6ba7b76927","Type":"ContainerDied","Data":"18053e77da6d524166a649a2782a3d37461020a743b795ff69e979c47bc3bb93"} Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.544302 4732 generic.go:334] "Generic (PLEG): container finished" podID="a25c6360-5e71-407f-8d5b-100bdcfd71e3" containerID="4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f" exitCode=0 Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.544334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" event={"ID":"a25c6360-5e71-407f-8d5b-100bdcfd71e3","Type":"ContainerDied","Data":"4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f"} Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.544352 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" event={"ID":"a25c6360-5e71-407f-8d5b-100bdcfd71e3","Type":"ContainerDied","Data":"eb620c3d3085aa65a8a695f27f762217a310a8a5a62886cb7cf6258ef978c238"} Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.544369 4732 scope.go:117] "RemoveContainer" containerID="4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.544516 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6d7cd5f486-p94xv" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.570783 4732 scope.go:117] "RemoveContainer" containerID="4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f" Oct 10 08:30:16 crc kubenswrapper[4732]: E1010 08:30:16.571545 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f\": container with ID starting with 4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f not found: ID does not exist" containerID="4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.571574 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f"} err="failed to get container status \"4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f\": rpc error: code = NotFound desc = could not find container \"4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f\": container with ID starting with 4837d8a487f77b1fc5c4c8ed452adb6a74519f6c7fafb685dd86d6dbc2f5258f not found: ID does not exist" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.643099 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data-custom\") pod \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.644037 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-combined-ca-bundle\") pod \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\" (UID: 
\"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.644273 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cfj4\" (UniqueName: \"kubernetes.io/projected/a25c6360-5e71-407f-8d5b-100bdcfd71e3-kube-api-access-7cfj4\") pod \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.644331 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data\") pod \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\" (UID: \"a25c6360-5e71-407f-8d5b-100bdcfd71e3\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.648018 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a25c6360-5e71-407f-8d5b-100bdcfd71e3" (UID: "a25c6360-5e71-407f-8d5b-100bdcfd71e3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.648041 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25c6360-5e71-407f-8d5b-100bdcfd71e3-kube-api-access-7cfj4" (OuterVolumeSpecName: "kube-api-access-7cfj4") pod "a25c6360-5e71-407f-8d5b-100bdcfd71e3" (UID: "a25c6360-5e71-407f-8d5b-100bdcfd71e3"). InnerVolumeSpecName "kube-api-access-7cfj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.667850 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a25c6360-5e71-407f-8d5b-100bdcfd71e3" (UID: "a25c6360-5e71-407f-8d5b-100bdcfd71e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.674404 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.696377 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data" (OuterVolumeSpecName: "config-data") pod "a25c6360-5e71-407f-8d5b-100bdcfd71e3" (UID: "a25c6360-5e71-407f-8d5b-100bdcfd71e3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.746644 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.746681 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.746704 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cfj4\" (UniqueName: \"kubernetes.io/projected/a25c6360-5e71-407f-8d5b-100bdcfd71e3-kube-api-access-7cfj4\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.746717 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25c6360-5e71-407f-8d5b-100bdcfd71e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.847536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6ddc\" (UniqueName: \"kubernetes.io/projected/25da90c2-6c96-45d4-9a99-ae6ba7b76927-kube-api-access-q6ddc\") pod \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.847659 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data-custom\") pod \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.847822 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data\") pod \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.847879 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-combined-ca-bundle\") pod \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\" (UID: \"25da90c2-6c96-45d4-9a99-ae6ba7b76927\") " Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.851884 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25da90c2-6c96-45d4-9a99-ae6ba7b76927" (UID: "25da90c2-6c96-45d4-9a99-ae6ba7b76927"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.852310 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25da90c2-6c96-45d4-9a99-ae6ba7b76927-kube-api-access-q6ddc" (OuterVolumeSpecName: "kube-api-access-q6ddc") pod "25da90c2-6c96-45d4-9a99-ae6ba7b76927" (UID: "25da90c2-6c96-45d4-9a99-ae6ba7b76927"). InnerVolumeSpecName "kube-api-access-q6ddc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.889126 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25da90c2-6c96-45d4-9a99-ae6ba7b76927" (UID: "25da90c2-6c96-45d4-9a99-ae6ba7b76927"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.894483 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6d7cd5f486-p94xv"] Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.901516 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data" (OuterVolumeSpecName: "config-data") pod "25da90c2-6c96-45d4-9a99-ae6ba7b76927" (UID: "25da90c2-6c96-45d4-9a99-ae6ba7b76927"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.903955 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6d7cd5f486-p94xv"] Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.950955 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.950979 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.950989 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6ddc\" (UniqueName: \"kubernetes.io/projected/25da90c2-6c96-45d4-9a99-ae6ba7b76927-kube-api-access-q6ddc\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:16 crc kubenswrapper[4732]: I1010 08:30:16.950998 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25da90c2-6c96-45d4-9a99-ae6ba7b76927-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:17 crc kubenswrapper[4732]: I1010 08:30:17.556535 4732 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/heat-api-799877d5cf-8dwqx" Oct 10 08:30:17 crc kubenswrapper[4732]: I1010 08:30:17.556604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-799877d5cf-8dwqx" event={"ID":"25da90c2-6c96-45d4-9a99-ae6ba7b76927","Type":"ContainerDied","Data":"d6b9a3f0acde60e4b1b9162f36b92a2016550fabb709f6e074feb1d6a8b0f5cf"} Oct 10 08:30:17 crc kubenswrapper[4732]: I1010 08:30:17.556762 4732 scope.go:117] "RemoveContainer" containerID="18053e77da6d524166a649a2782a3d37461020a743b795ff69e979c47bc3bb93" Oct 10 08:30:17 crc kubenswrapper[4732]: I1010 08:30:17.608138 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-799877d5cf-8dwqx"] Oct 10 08:30:17 crc kubenswrapper[4732]: I1010 08:30:17.616440 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-799877d5cf-8dwqx"] Oct 10 08:30:17 crc kubenswrapper[4732]: I1010 08:30:17.679451 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" path="/var/lib/kubelet/pods/25da90c2-6c96-45d4-9a99-ae6ba7b76927/volumes" Oct 10 08:30:17 crc kubenswrapper[4732]: I1010 08:30:17.680166 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25c6360-5e71-407f-8d5b-100bdcfd71e3" path="/var/lib/kubelet/pods/a25c6360-5e71-407f-8d5b-100bdcfd71e3/volumes" Oct 10 08:30:22 crc kubenswrapper[4732]: I1010 08:30:22.504896 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-575d45d5d7-xbm4f" Oct 10 08:30:22 crc kubenswrapper[4732]: I1010 08:30:22.618915 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56b56778f9-lhkw6"] Oct 10 08:30:22 crc kubenswrapper[4732]: I1010 08:30:22.625635 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:22 crc kubenswrapper[4732]: I1010 08:30:22.646591 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7dbb656958-mt7c9" Oct 10 08:30:22 crc kubenswrapper[4732]: I1010 08:30:22.700806 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-cc954bc69-kbqfk"] Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.096671 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.103894 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.179321 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctqtq\" (UniqueName: \"kubernetes.io/projected/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-kube-api-access-ctqtq\") pod \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.179959 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data\") pod \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.180127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-combined-ca-bundle\") pod \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.180260 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data-custom\") pod 
\"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\" (UID: \"1d1f6e5e-2a3c-44f6-a027-76a73c54085a\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.187051 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d1f6e5e-2a3c-44f6-a027-76a73c54085a" (UID: "1d1f6e5e-2a3c-44f6-a027-76a73c54085a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.195932 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-kube-api-access-ctqtq" (OuterVolumeSpecName: "kube-api-access-ctqtq") pod "1d1f6e5e-2a3c-44f6-a027-76a73c54085a" (UID: "1d1f6e5e-2a3c-44f6-a027-76a73c54085a"). InnerVolumeSpecName "kube-api-access-ctqtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.231939 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d1f6e5e-2a3c-44f6-a027-76a73c54085a" (UID: "1d1f6e5e-2a3c-44f6-a027-76a73c54085a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.243489 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data" (OuterVolumeSpecName: "config-data") pod "1d1f6e5e-2a3c-44f6-a027-76a73c54085a" (UID: "1d1f6e5e-2a3c-44f6-a027-76a73c54085a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.281845 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data\") pod \"e774fac5-92b3-4c5b-be38-0499157c9194\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.281906 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgbwt\" (UniqueName: \"kubernetes.io/projected/e774fac5-92b3-4c5b-be38-0499157c9194-kube-api-access-jgbwt\") pod \"e774fac5-92b3-4c5b-be38-0499157c9194\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.281948 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-combined-ca-bundle\") pod \"e774fac5-92b3-4c5b-be38-0499157c9194\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.282127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data-custom\") pod \"e774fac5-92b3-4c5b-be38-0499157c9194\" (UID: \"e774fac5-92b3-4c5b-be38-0499157c9194\") " Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.282528 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctqtq\" (UniqueName: \"kubernetes.io/projected/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-kube-api-access-ctqtq\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.282545 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data\") on node 
\"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.282554 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.282564 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d1f6e5e-2a3c-44f6-a027-76a73c54085a-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.285938 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e774fac5-92b3-4c5b-be38-0499157c9194-kube-api-access-jgbwt" (OuterVolumeSpecName: "kube-api-access-jgbwt") pod "e774fac5-92b3-4c5b-be38-0499157c9194" (UID: "e774fac5-92b3-4c5b-be38-0499157c9194"). InnerVolumeSpecName "kube-api-access-jgbwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.286307 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e774fac5-92b3-4c5b-be38-0499157c9194" (UID: "e774fac5-92b3-4c5b-be38-0499157c9194"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.310051 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e774fac5-92b3-4c5b-be38-0499157c9194" (UID: "e774fac5-92b3-4c5b-be38-0499157c9194"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.333207 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data" (OuterVolumeSpecName: "config-data") pod "e774fac5-92b3-4c5b-be38-0499157c9194" (UID: "e774fac5-92b3-4c5b-be38-0499157c9194"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.384332 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.384366 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgbwt\" (UniqueName: \"kubernetes.io/projected/e774fac5-92b3-4c5b-be38-0499157c9194-kube-api-access-jgbwt\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.384379 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.384423 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e774fac5-92b3-4c5b-be38-0499157c9194-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.646020 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" event={"ID":"e774fac5-92b3-4c5b-be38-0499157c9194","Type":"ContainerDied","Data":"b63ac20417728eb29d8f5bd785b0613ce172467b0157cb6dbc9ed77ba86152af"} Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.646056 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-cc954bc69-kbqfk" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.646078 4732 scope.go:117] "RemoveContainer" containerID="227f5ae4e09967ec9413a26749c523cbdfbfe78430dde1aa60d3a542a8ea2db0" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.647219 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-56b56778f9-lhkw6" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.647225 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-56b56778f9-lhkw6" event={"ID":"1d1f6e5e-2a3c-44f6-a027-76a73c54085a","Type":"ContainerDied","Data":"50585a3d2d9211e04c9b5c39466460996d1d7260529cf1aa18ea83ab3b692202"} Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.675412 4732 scope.go:117] "RemoveContainer" containerID="e29e13f99c859f74e7cf3a9737cd03a21b5b247e4efb5a0e1bf538acbd4a95f2" Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.693800 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-56b56778f9-lhkw6"] Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.702543 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-56b56778f9-lhkw6"] Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.715802 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-cc954bc69-kbqfk"] Oct 10 08:30:23 crc kubenswrapper[4732]: I1010 08:30:23.726312 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-cc954bc69-kbqfk"] Oct 10 08:30:25 crc kubenswrapper[4732]: I1010 08:30:25.671464 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" path="/var/lib/kubelet/pods/1d1f6e5e-2a3c-44f6-a027-76a73c54085a/volumes" Oct 10 08:30:25 crc kubenswrapper[4732]: I1010 08:30:25.672535 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" 
path="/var/lib/kubelet/pods/e774fac5-92b3-4c5b-be38-0499157c9194/volumes" Oct 10 08:30:25 crc kubenswrapper[4732]: I1010 08:30:25.673601 4732 generic.go:334] "Generic (PLEG): container finished" podID="f934d435-c311-497b-9298-43a3f48e717f" containerID="a39a3119019264e0284f247dfa899bd69435ff229af03952c369ca4a788c539e" exitCode=137 Oct 10 08:30:25 crc kubenswrapper[4732]: I1010 08:30:25.673633 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbd6844cb-rnrwt" event={"ID":"f934d435-c311-497b-9298-43a3f48e717f","Type":"ContainerDied","Data":"a39a3119019264e0284f247dfa899bd69435ff229af03952c369ca4a788c539e"} Oct 10 08:30:25 crc kubenswrapper[4732]: I1010 08:30:25.911099 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030448 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-config-data\") pod \"f934d435-c311-497b-9298-43a3f48e717f\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030571 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934d435-c311-497b-9298-43a3f48e717f-logs\") pod \"f934d435-c311-497b-9298-43a3f48e717f\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030621 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk5r5\" (UniqueName: \"kubernetes.io/projected/f934d435-c311-497b-9298-43a3f48e717f-kube-api-access-wk5r5\") pod \"f934d435-c311-497b-9298-43a3f48e717f\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030717 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-secret-key\") pod \"f934d435-c311-497b-9298-43a3f48e717f\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-combined-ca-bundle\") pod \"f934d435-c311-497b-9298-43a3f48e717f\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030772 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-scripts\") pod \"f934d435-c311-497b-9298-43a3f48e717f\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030832 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-tls-certs\") pod \"f934d435-c311-497b-9298-43a3f48e717f\" (UID: \"f934d435-c311-497b-9298-43a3f48e717f\") " Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.030948 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f934d435-c311-497b-9298-43a3f48e717f-logs" (OuterVolumeSpecName: "logs") pod "f934d435-c311-497b-9298-43a3f48e717f" (UID: "f934d435-c311-497b-9298-43a3f48e717f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.031456 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f934d435-c311-497b-9298-43a3f48e717f-logs\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.036012 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f934d435-c311-497b-9298-43a3f48e717f-kube-api-access-wk5r5" (OuterVolumeSpecName: "kube-api-access-wk5r5") pod "f934d435-c311-497b-9298-43a3f48e717f" (UID: "f934d435-c311-497b-9298-43a3f48e717f"). InnerVolumeSpecName "kube-api-access-wk5r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.036314 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f934d435-c311-497b-9298-43a3f48e717f" (UID: "f934d435-c311-497b-9298-43a3f48e717f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.056852 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-config-data" (OuterVolumeSpecName: "config-data") pod "f934d435-c311-497b-9298-43a3f48e717f" (UID: "f934d435-c311-497b-9298-43a3f48e717f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.058193 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-scripts" (OuterVolumeSpecName: "scripts") pod "f934d435-c311-497b-9298-43a3f48e717f" (UID: "f934d435-c311-497b-9298-43a3f48e717f"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.060668 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f934d435-c311-497b-9298-43a3f48e717f" (UID: "f934d435-c311-497b-9298-43a3f48e717f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.086058 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f934d435-c311-497b-9298-43a3f48e717f" (UID: "f934d435-c311-497b-9298-43a3f48e717f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.134182 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk5r5\" (UniqueName: \"kubernetes.io/projected/f934d435-c311-497b-9298-43a3f48e717f-kube-api-access-wk5r5\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.134214 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.134224 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.134232 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.134242 4732 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f934d435-c311-497b-9298-43a3f48e717f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.134251 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f934d435-c311-497b-9298-43a3f48e717f-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.683889 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cbd6844cb-rnrwt" event={"ID":"f934d435-c311-497b-9298-43a3f48e717f","Type":"ContainerDied","Data":"e15d3ed3dd4397268106ff22be226950a45da2f33a9fb897bc46d5d4b0f278d2"} Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.683945 4732 scope.go:117] "RemoveContainer" containerID="52ea17f600f69a171aaaaa793ba0984c355be42198e0ebdf848e2a5cefd69ab6" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.684097 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cbd6844cb-rnrwt" Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.736531 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cbd6844cb-rnrwt"] Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.744668 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cbd6844cb-rnrwt"] Oct 10 08:30:26 crc kubenswrapper[4732]: I1010 08:30:26.966119 4732 scope.go:117] "RemoveContainer" containerID="a39a3119019264e0284f247dfa899bd69435ff229af03952c369ca4a788c539e" Oct 10 08:30:27 crc kubenswrapper[4732]: I1010 08:30:27.671780 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f934d435-c311-497b-9298-43a3f48e717f" path="/var/lib/kubelet/pods/f934d435-c311-497b-9298-43a3f48e717f/volumes" Oct 10 08:30:29 crc kubenswrapper[4732]: I1010 08:30:29.792737 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5844bd7ddb-sds75" Oct 10 08:30:29 crc kubenswrapper[4732]: I1010 08:30:29.836063 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-d6b8bdf8b-bkd26"] Oct 10 08:30:29 crc kubenswrapper[4732]: I1010 08:30:29.836374 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-d6b8bdf8b-bkd26" podUID="922f5d41-7b29-4466-a199-00ac9ff5a424" containerName="heat-engine" containerID="cri-o://df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2" gracePeriod=60 Oct 10 08:30:32 crc kubenswrapper[4732]: E1010 08:30:32.578832 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 10 08:30:32 crc kubenswrapper[4732]: E1010 08:30:32.582037 4732 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 10 08:30:32 crc kubenswrapper[4732]: E1010 08:30:32.584552 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Oct 10 08:30:32 crc kubenswrapper[4732]: E1010 08:30:32.584601 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-d6b8bdf8b-bkd26" podUID="922f5d41-7b29-4466-a199-00ac9ff5a424" containerName="heat-engine" Oct 10 08:30:34 crc kubenswrapper[4732]: I1010 08:30:34.817545 4732 scope.go:117] "RemoveContainer" containerID="98e77f1ca7410be7322702551a539b55acb07180737af5cf28ffd0a5b9abe43b" Oct 10 08:30:34 crc kubenswrapper[4732]: I1010 08:30:34.861596 4732 scope.go:117] "RemoveContainer" containerID="6b4a0afd3b0af80b544e00e777b7a87827376c177ac04897493c68162479b41f" Oct 10 08:30:34 crc kubenswrapper[4732]: I1010 08:30:34.884461 4732 scope.go:117] "RemoveContainer" containerID="050d265c2e176646e010776b89fb4b86e150efa09692e4d759afefbf99b908d9" Oct 10 08:30:34 crc kubenswrapper[4732]: I1010 08:30:34.928665 4732 scope.go:117] "RemoveContainer" containerID="933636b52d682f8679c3a3929f2a945ad2993ab5044f7c2e8fabfeffabe91111" Oct 10 08:30:41 crc kubenswrapper[4732]: I1010 08:30:41.831305 4732 generic.go:334] "Generic (PLEG): container finished" podID="922f5d41-7b29-4466-a199-00ac9ff5a424" 
containerID="df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2" exitCode=0 Oct 10 08:30:41 crc kubenswrapper[4732]: I1010 08:30:41.831427 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d6b8bdf8b-bkd26" event={"ID":"922f5d41-7b29-4466-a199-00ac9ff5a424","Type":"ContainerDied","Data":"df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2"} Oct 10 08:30:41 crc kubenswrapper[4732]: I1010 08:30:41.956606 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.079530 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-combined-ca-bundle\") pod \"922f5d41-7b29-4466-a199-00ac9ff5a424\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.079595 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data-custom\") pod \"922f5d41-7b29-4466-a199-00ac9ff5a424\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.079824 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2ftp\" (UniqueName: \"kubernetes.io/projected/922f5d41-7b29-4466-a199-00ac9ff5a424-kube-api-access-j2ftp\") pod \"922f5d41-7b29-4466-a199-00ac9ff5a424\" (UID: \"922f5d41-7b29-4466-a199-00ac9ff5a424\") " Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.080099 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data\") pod \"922f5d41-7b29-4466-a199-00ac9ff5a424\" (UID: 
\"922f5d41-7b29-4466-a199-00ac9ff5a424\") " Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.085634 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "922f5d41-7b29-4466-a199-00ac9ff5a424" (UID: "922f5d41-7b29-4466-a199-00ac9ff5a424"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.101264 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/922f5d41-7b29-4466-a199-00ac9ff5a424-kube-api-access-j2ftp" (OuterVolumeSpecName: "kube-api-access-j2ftp") pod "922f5d41-7b29-4466-a199-00ac9ff5a424" (UID: "922f5d41-7b29-4466-a199-00ac9ff5a424"). InnerVolumeSpecName "kube-api-access-j2ftp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.114603 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "922f5d41-7b29-4466-a199-00ac9ff5a424" (UID: "922f5d41-7b29-4466-a199-00ac9ff5a424"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.170050 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data" (OuterVolumeSpecName: "config-data") pod "922f5d41-7b29-4466-a199-00ac9ff5a424" (UID: "922f5d41-7b29-4466-a199-00ac9ff5a424"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.183731 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2ftp\" (UniqueName: \"kubernetes.io/projected/922f5d41-7b29-4466-a199-00ac9ff5a424-kube-api-access-j2ftp\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.183821 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.183834 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.183846 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/922f5d41-7b29-4466-a199-00ac9ff5a424-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.843938 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-d6b8bdf8b-bkd26" event={"ID":"922f5d41-7b29-4466-a199-00ac9ff5a424","Type":"ContainerDied","Data":"e5ca578ca47b4cf1daf5ba86389511463716ca4eff6493f76052f5d12acef155"} Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.844298 4732 scope.go:117] "RemoveContainer" containerID="df86eb82961f51349496640d2d72a99ee76533ae6793816a7e74ce6faa9eadd2" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.844094 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-d6b8bdf8b-bkd26" Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.912939 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-d6b8bdf8b-bkd26"] Oct 10 08:30:42 crc kubenswrapper[4732]: I1010 08:30:42.922330 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-d6b8bdf8b-bkd26"] Oct 10 08:30:43 crc kubenswrapper[4732]: E1010 08:30:43.029259 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod922f5d41_7b29_4466_a199_00ac9ff5a424.slice/crio-e5ca578ca47b4cf1daf5ba86389511463716ca4eff6493f76052f5d12acef155\": RecentStats: unable to find data in memory cache]" Oct 10 08:30:43 crc kubenswrapper[4732]: I1010 08:30:43.690486 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="922f5d41-7b29-4466-a199-00ac9ff5a424" path="/var/lib/kubelet/pods/922f5d41-7b29-4466-a199-00ac9ff5a424/volumes" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.625129 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq"] Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.625892 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon-log" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.625907 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon-log" Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.625921 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.625928 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e774fac5-92b3-4c5b-be38-0499157c9194" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.625944 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.625952 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.625971 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.625978 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.625999 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626006 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.626023 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="922f5d41-7b29-4466-a199-00ac9ff5a424" containerName="heat-engine" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626033 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="922f5d41-7b29-4466-a199-00ac9ff5a424" containerName="heat-engine" Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.626044 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25c6360-5e71-407f-8d5b-100bdcfd71e3" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626052 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25c6360-5e71-407f-8d5b-100bdcfd71e3" containerName="heat-cfnapi" 
Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.626074 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626083 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626314 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626333 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626347 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="922f5d41-7b29-4466-a199-00ac9ff5a424" containerName="heat-engine" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626363 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626373 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1f6e5e-2a3c-44f6-a027-76a73c54085a" containerName="heat-api" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626386 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626398 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25c6360-5e71-407f-8d5b-100bdcfd71e3" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626409 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="25da90c2-6c96-45d4-9a99-ae6ba7b76927" containerName="heat-api" Oct 10 08:30:47 crc 
kubenswrapper[4732]: I1010 08:30:47.626420 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f934d435-c311-497b-9298-43a3f48e717f" containerName="horizon-log" Oct 10 08:30:47 crc kubenswrapper[4732]: E1010 08:30:47.626597 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.626606 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e774fac5-92b3-4c5b-be38-0499157c9194" containerName="heat-cfnapi" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.628064 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.633280 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.637750 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq"] Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.795029 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxvvk\" (UniqueName: \"kubernetes.io/projected/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-kube-api-access-dxvvk\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.795739 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-bundle\") pod 
\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.796047 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.899200 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxvvk\" (UniqueName: \"kubernetes.io/projected/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-kube-api-access-dxvvk\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.899289 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.899317 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.899913 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.900015 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.918016 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxvvk\" (UniqueName: \"kubernetes.io/projected/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-kube-api-access-dxvvk\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:47 crc kubenswrapper[4732]: I1010 08:30:47.954650 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:48 crc kubenswrapper[4732]: I1010 08:30:48.434143 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq"] Oct 10 08:30:48 crc kubenswrapper[4732]: I1010 08:30:48.906659 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" event={"ID":"56a273c6-4f36-48f0-83df-2bbe1b6fe6df","Type":"ContainerStarted","Data":"e40571dcc8edb07e964bcd02d2b9db734cb5dfc608fa68d7a9e917080ab0fc94"} Oct 10 08:30:48 crc kubenswrapper[4732]: I1010 08:30:48.907067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" event={"ID":"56a273c6-4f36-48f0-83df-2bbe1b6fe6df","Type":"ContainerStarted","Data":"7734fd0dce837177cc3abcf00c559c19a6a9e3b17d04d0b626ea014883c8a9e1"} Oct 10 08:30:49 crc kubenswrapper[4732]: I1010 08:30:49.918734 4732 generic.go:334] "Generic (PLEG): container finished" podID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerID="e40571dcc8edb07e964bcd02d2b9db734cb5dfc608fa68d7a9e917080ab0fc94" exitCode=0 Oct 10 08:30:49 crc kubenswrapper[4732]: I1010 08:30:49.918818 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" event={"ID":"56a273c6-4f36-48f0-83df-2bbe1b6fe6df","Type":"ContainerDied","Data":"e40571dcc8edb07e964bcd02d2b9db734cb5dfc608fa68d7a9e917080ab0fc94"} Oct 10 08:30:51 crc kubenswrapper[4732]: I1010 08:30:51.950882 4732 generic.go:334] "Generic (PLEG): container finished" podID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerID="d1888dffb9fda55d975a94764db049cb31dc7794a271d52a9e5b06189c6bc64f" exitCode=0 Oct 10 08:30:51 crc kubenswrapper[4732]: I1010 08:30:51.950967 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" event={"ID":"56a273c6-4f36-48f0-83df-2bbe1b6fe6df","Type":"ContainerDied","Data":"d1888dffb9fda55d975a94764db049cb31dc7794a271d52a9e5b06189c6bc64f"} Oct 10 08:30:52 crc kubenswrapper[4732]: I1010 08:30:52.963214 4732 generic.go:334] "Generic (PLEG): container finished" podID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerID="e475f3edb4f476e82726f233a6a6a3ce15e0be8cec2c358ea30c87a3f593f006" exitCode=0 Oct 10 08:30:52 crc kubenswrapper[4732]: I1010 08:30:52.963296 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" event={"ID":"56a273c6-4f36-48f0-83df-2bbe1b6fe6df","Type":"ContainerDied","Data":"e475f3edb4f476e82726f233a6a6a3ce15e0be8cec2c358ea30c87a3f593f006"} Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.325260 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.428767 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-util\") pod \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.428910 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxvvk\" (UniqueName: \"kubernetes.io/projected/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-kube-api-access-dxvvk\") pod \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.429012 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-bundle\") pod \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\" (UID: \"56a273c6-4f36-48f0-83df-2bbe1b6fe6df\") " Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.431082 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-bundle" (OuterVolumeSpecName: "bundle") pod "56a273c6-4f36-48f0-83df-2bbe1b6fe6df" (UID: "56a273c6-4f36-48f0-83df-2bbe1b6fe6df"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.435887 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-kube-api-access-dxvvk" (OuterVolumeSpecName: "kube-api-access-dxvvk") pod "56a273c6-4f36-48f0-83df-2bbe1b6fe6df" (UID: "56a273c6-4f36-48f0-83df-2bbe1b6fe6df"). InnerVolumeSpecName "kube-api-access-dxvvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.440876 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-util" (OuterVolumeSpecName: "util") pod "56a273c6-4f36-48f0-83df-2bbe1b6fe6df" (UID: "56a273c6-4f36-48f0-83df-2bbe1b6fe6df"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.532353 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-util\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.532429 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxvvk\" (UniqueName: \"kubernetes.io/projected/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-kube-api-access-dxvvk\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.532446 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/56a273c6-4f36-48f0-83df-2bbe1b6fe6df-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.981165 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" event={"ID":"56a273c6-4f36-48f0-83df-2bbe1b6fe6df","Type":"ContainerDied","Data":"7734fd0dce837177cc3abcf00c559c19a6a9e3b17d04d0b626ea014883c8a9e1"} Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.981216 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7734fd0dce837177cc3abcf00c559c19a6a9e3b17d04d0b626ea014883c8a9e1" Oct 10 08:30:54 crc kubenswrapper[4732]: I1010 08:30:54.981276 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq" Oct 10 08:31:00 crc kubenswrapper[4732]: I1010 08:31:00.055830 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4jc2k"] Oct 10 08:31:00 crc kubenswrapper[4732]: I1010 08:31:00.064586 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4jc2k"] Oct 10 08:31:01 crc kubenswrapper[4732]: I1010 08:31:01.671174 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a51b14-b879-47b4-9fae-efa0ec35db2f" path="/var/lib/kubelet/pods/d6a51b14-b879-47b4-9fae-efa0ec35db2f/volumes" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.488372 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54"] Oct 10 08:31:05 crc kubenswrapper[4732]: E1010 08:31:05.489274 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerName="pull" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.489286 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerName="pull" Oct 10 08:31:05 crc kubenswrapper[4732]: E1010 08:31:05.489302 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerName="util" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.489308 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerName="util" Oct 10 08:31:05 crc kubenswrapper[4732]: E1010 08:31:05.489316 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerName="extract" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.489324 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerName="extract" 
Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.489530 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56a273c6-4f36-48f0-83df-2bbe1b6fe6df" containerName="extract" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.490199 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.492575 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-wz9nr" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.492720 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.495503 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.504212 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.538020 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.539459 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.542095 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.547705 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.548869 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.549483 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-x7rdd" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.582817 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.642652 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.675068 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf\" (UID: \"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.675122 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96c3930e-4172-47a4-85fa-d421a85fa25f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-gd956\" (UID: \"96c3930e-4172-47a4-85fa-d421a85fa25f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.675164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96c3930e-4172-47a4-85fa-d421a85fa25f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-gd956\" (UID: \"96c3930e-4172-47a4-85fa-d421a85fa25f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.675194 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k6jr\" (UniqueName: \"kubernetes.io/projected/b83736e0-1c5a-4160-a56a-eaaa894e994d-kube-api-access-8k6jr\") pod \"obo-prometheus-operator-7c8cf85677-2bz54\" (UID: \"b83736e0-1c5a-4160-a56a-eaaa894e994d\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.675257 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf\" (UID: \"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.728884 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-tds2n"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.730490 4732 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.733394 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-wsdf6" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.733617 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.738976 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-tds2n"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.778771 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf\" (UID: \"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.778863 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96c3930e-4172-47a4-85fa-d421a85fa25f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-gd956\" (UID: \"96c3930e-4172-47a4-85fa-d421a85fa25f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.778918 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96c3930e-4172-47a4-85fa-d421a85fa25f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-gd956\" (UID: \"96c3930e-4172-47a4-85fa-d421a85fa25f\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.778952 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k6jr\" (UniqueName: \"kubernetes.io/projected/b83736e0-1c5a-4160-a56a-eaaa894e994d-kube-api-access-8k6jr\") pod \"obo-prometheus-operator-7c8cf85677-2bz54\" (UID: \"b83736e0-1c5a-4160-a56a-eaaa894e994d\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.779027 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf\" (UID: \"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.795320 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/96c3930e-4172-47a4-85fa-d421a85fa25f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-gd956\" (UID: \"96c3930e-4172-47a4-85fa-d421a85fa25f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.795408 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf\" (UID: \"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.796406 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf\" (UID: \"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.801114 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/96c3930e-4172-47a4-85fa-d421a85fa25f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54c4c6765f-gd956\" (UID: \"96c3930e-4172-47a4-85fa-d421a85fa25f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.814430 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k6jr\" (UniqueName: \"kubernetes.io/projected/b83736e0-1c5a-4160-a56a-eaaa894e994d-kube-api-access-8k6jr\") pod \"obo-prometheus-operator-7c8cf85677-2bz54\" (UID: \"b83736e0-1c5a-4160-a56a-eaaa894e994d\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.843985 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-cwnj8"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.845438 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.848992 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zd2sn" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.873769 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-cwnj8"] Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.880246 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvrw\" (UniqueName: \"kubernetes.io/projected/3e0db6e5-6938-46a4-91c7-9eb584d07a5d-kube-api-access-xfvrw\") pod \"observability-operator-cc5f78dfc-tds2n\" (UID: \"3e0db6e5-6938-46a4-91c7-9eb584d07a5d\") " pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.880300 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e0db6e5-6938-46a4-91c7-9eb584d07a5d-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-tds2n\" (UID: \"3e0db6e5-6938-46a4-91c7-9eb584d07a5d\") " pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.923150 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.937098 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.983205 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvrw\" (UniqueName: \"kubernetes.io/projected/3e0db6e5-6938-46a4-91c7-9eb584d07a5d-kube-api-access-xfvrw\") pod \"observability-operator-cc5f78dfc-tds2n\" (UID: \"3e0db6e5-6938-46a4-91c7-9eb584d07a5d\") " pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.983270 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2gbk\" (UniqueName: \"kubernetes.io/projected/2f3c5b4f-3be5-47c7-a2af-3e85122d303b-kube-api-access-v2gbk\") pod \"perses-operator-54bc95c9fb-cwnj8\" (UID: \"2f3c5b4f-3be5-47c7-a2af-3e85122d303b\") " pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.983580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e0db6e5-6938-46a4-91c7-9eb584d07a5d-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-tds2n\" (UID: \"3e0db6e5-6938-46a4-91c7-9eb584d07a5d\") " pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.983636 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f3c5b4f-3be5-47c7-a2af-3e85122d303b-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-cwnj8\" (UID: \"2f3c5b4f-3be5-47c7-a2af-3e85122d303b\") " pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:05 crc kubenswrapper[4732]: I1010 08:31:05.988479 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e0db6e5-6938-46a4-91c7-9eb584d07a5d-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-tds2n\" (UID: \"3e0db6e5-6938-46a4-91c7-9eb584d07a5d\") " pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.015478 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvrw\" (UniqueName: \"kubernetes.io/projected/3e0db6e5-6938-46a4-91c7-9eb584d07a5d-kube-api-access-xfvrw\") pod \"observability-operator-cc5f78dfc-tds2n\" (UID: \"3e0db6e5-6938-46a4-91c7-9eb584d07a5d\") " pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.064237 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.085849 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2f3c5b4f-3be5-47c7-a2af-3e85122d303b-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-cwnj8\" (UID: \"2f3c5b4f-3be5-47c7-a2af-3e85122d303b\") " pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.085983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2gbk\" (UniqueName: \"kubernetes.io/projected/2f3c5b4f-3be5-47c7-a2af-3e85122d303b-kube-api-access-v2gbk\") pod \"perses-operator-54bc95c9fb-cwnj8\" (UID: \"2f3c5b4f-3be5-47c7-a2af-3e85122d303b\") " pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.086828 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2f3c5b4f-3be5-47c7-a2af-3e85122d303b-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-cwnj8\" (UID: \"2f3c5b4f-3be5-47c7-a2af-3e85122d303b\") " pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.114926 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.125574 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2gbk\" (UniqueName: \"kubernetes.io/projected/2f3c5b4f-3be5-47c7-a2af-3e85122d303b-kube-api-access-v2gbk\") pod \"perses-operator-54bc95c9fb-cwnj8\" (UID: \"2f3c5b4f-3be5-47c7-a2af-3e85122d303b\") " pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.216979 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.719888 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956"] Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.726234 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.895848 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ksrn5"] Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.898774 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.949462 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksrn5"] Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.984162 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-tds2n"] Oct 10 08:31:06 crc kubenswrapper[4732]: I1010 08:31:06.998177 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf"] Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.014522 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-utilities\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.014606 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-catalog-content\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.014846 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bs4k\" (UniqueName: \"kubernetes.io/projected/31624964-5c28-4579-93f4-4536eb3f2671-kube-api-access-7bs4k\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.070321 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54"] Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.117282 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bs4k\" (UniqueName: \"kubernetes.io/projected/31624964-5c28-4579-93f4-4536eb3f2671-kube-api-access-7bs4k\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.117446 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-utilities\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.117505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-catalog-content\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.117937 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-utilities\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.118104 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-catalog-content\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " 
pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.127559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" event={"ID":"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23","Type":"ContainerStarted","Data":"5dc86e9b08583970e3b2e74501b2f9b4a563cadb53fb5d667c4612b44a4c4a91"} Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.135816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" event={"ID":"3e0db6e5-6938-46a4-91c7-9eb584d07a5d","Type":"ContainerStarted","Data":"8209d33242b935b0bb6be33a2886b93d4e4d6372d26a21116f8d459ce6dd6208"} Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.138929 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" event={"ID":"96c3930e-4172-47a4-85fa-d421a85fa25f","Type":"ContainerStarted","Data":"3a084bd56f74557cfc66763fcbe8eec0059ce8f22d230780ebf9395572266971"} Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.141264 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-cwnj8"] Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.141408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" event={"ID":"b83736e0-1c5a-4160-a56a-eaaa894e994d","Type":"ContainerStarted","Data":"3459fafe320ac52e9b25d69b59aca26f2a3261a9d7387be5a8c9acc77f61a025"} Oct 10 08:31:07 crc kubenswrapper[4732]: W1010 08:31:07.147590 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f3c5b4f_3be5_47c7_a2af_3e85122d303b.slice/crio-effc5ee7bfef72cbd93794f3350994dd4a462c737c4b3b39f77f54a4d71b9872 WatchSource:0}: Error finding container 
effc5ee7bfef72cbd93794f3350994dd4a462c737c4b3b39f77f54a4d71b9872: Status 404 returned error can't find the container with id effc5ee7bfef72cbd93794f3350994dd4a462c737c4b3b39f77f54a4d71b9872 Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.155925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bs4k\" (UniqueName: \"kubernetes.io/projected/31624964-5c28-4579-93f4-4536eb3f2671-kube-api-access-7bs4k\") pod \"redhat-marketplace-ksrn5\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.250782 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:07 crc kubenswrapper[4732]: I1010 08:31:07.989787 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksrn5"] Oct 10 08:31:08 crc kubenswrapper[4732]: I1010 08:31:08.156131 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksrn5" event={"ID":"31624964-5c28-4579-93f4-4536eb3f2671","Type":"ContainerStarted","Data":"cba9748297ba654142f5a8122121ca6e85d62f42f6d9e9d73b2da508b04a3e78"} Oct 10 08:31:08 crc kubenswrapper[4732]: I1010 08:31:08.158051 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" event={"ID":"2f3c5b4f-3be5-47c7-a2af-3e85122d303b","Type":"ContainerStarted","Data":"effc5ee7bfef72cbd93794f3350994dd4a462c737c4b3b39f77f54a4d71b9872"} Oct 10 08:31:09 crc kubenswrapper[4732]: I1010 08:31:09.172123 4732 generic.go:334] "Generic (PLEG): container finished" podID="31624964-5c28-4579-93f4-4536eb3f2671" containerID="2bf76e2c58b875473328fc3477fa1c6125d874c2543e982e0892979fb9f7f668" exitCode=0 Oct 10 08:31:09 crc kubenswrapper[4732]: I1010 08:31:09.172296 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ksrn5" event={"ID":"31624964-5c28-4579-93f4-4536eb3f2671","Type":"ContainerDied","Data":"2bf76e2c58b875473328fc3477fa1c6125d874c2543e982e0892979fb9f7f668"} Oct 10 08:31:10 crc kubenswrapper[4732]: I1010 08:31:10.058832 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e330-account-create-p9ll6"] Oct 10 08:31:10 crc kubenswrapper[4732]: I1010 08:31:10.077023 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e330-account-create-p9ll6"] Oct 10 08:31:10 crc kubenswrapper[4732]: I1010 08:31:10.192229 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" event={"ID":"43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23","Type":"ContainerStarted","Data":"cf4bfdd1d7641fa0c963e2e9d453b10abeb9e2e0ed614aed0ba1eabddf1b55f6"} Oct 10 08:31:10 crc kubenswrapper[4732]: I1010 08:31:10.202681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" event={"ID":"96c3930e-4172-47a4-85fa-d421a85fa25f","Type":"ContainerStarted","Data":"bdfaef3019c590da955b36d3a6598108c1ca76bc2eca0e6a389084c830c9bf1b"} Oct 10 08:31:10 crc kubenswrapper[4732]: I1010 08:31:10.221007 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf" podStartSLOduration=3.345873856 podStartE2EDuration="5.220987838s" podCreationTimestamp="2025-10-10 08:31:05 +0000 UTC" firstStartedPulling="2025-10-10 08:31:07.029097406 +0000 UTC m=+5994.098688647" lastFinishedPulling="2025-10-10 08:31:08.904211378 +0000 UTC m=+5995.973802629" observedRunningTime="2025-10-10 08:31:10.216267111 +0000 UTC m=+5997.285858372" watchObservedRunningTime="2025-10-10 08:31:10.220987838 +0000 UTC m=+5997.290579079" Oct 10 08:31:10 crc kubenswrapper[4732]: I1010 08:31:10.252627 4732 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54c4c6765f-gd956" podStartSLOduration=3.074443498 podStartE2EDuration="5.252608935s" podCreationTimestamp="2025-10-10 08:31:05 +0000 UTC" firstStartedPulling="2025-10-10 08:31:06.725895547 +0000 UTC m=+5993.795486788" lastFinishedPulling="2025-10-10 08:31:08.904060984 +0000 UTC m=+5995.973652225" observedRunningTime="2025-10-10 08:31:10.234925131 +0000 UTC m=+5997.304516392" watchObservedRunningTime="2025-10-10 08:31:10.252608935 +0000 UTC m=+5997.322200176" Oct 10 08:31:11 crc kubenswrapper[4732]: I1010 08:31:11.676440 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52aa122-1110-42ef-90b9-2fff969a770b" path="/var/lib/kubelet/pods/a52aa122-1110-42ef-90b9-2fff969a770b/volumes" Oct 10 08:31:15 crc kubenswrapper[4732]: I1010 08:31:15.258686 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" event={"ID":"2f3c5b4f-3be5-47c7-a2af-3e85122d303b","Type":"ContainerStarted","Data":"5bb02b113bd18db2ae93d335daee7e984468426b0ea33be655a7327cccef913e"} Oct 10 08:31:15 crc kubenswrapper[4732]: I1010 08:31:15.261303 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" event={"ID":"3e0db6e5-6938-46a4-91c7-9eb584d07a5d","Type":"ContainerStarted","Data":"99f1f9c2e26ae06c095d3abfdbc852d20878eaa457fa554fd88177b592a928bb"} Oct 10 08:31:15 crc kubenswrapper[4732]: I1010 08:31:15.261521 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:15 crc kubenswrapper[4732]: I1010 08:31:15.263599 4732 patch_prober.go:28] interesting pod/observability-operator-cc5f78dfc-tds2n container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.1.122:8081/healthz\": dial tcp 
10.217.1.122:8081: connect: connection refused" start-of-body= Oct 10 08:31:15 crc kubenswrapper[4732]: I1010 08:31:15.263638 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" podUID="3e0db6e5-6938-46a4-91c7-9eb584d07a5d" containerName="operator" probeResult="failure" output="Get \"http://10.217.1.122:8081/healthz\": dial tcp 10.217.1.122:8081: connect: connection refused" Oct 10 08:31:15 crc kubenswrapper[4732]: I1010 08:31:15.271271 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksrn5" event={"ID":"31624964-5c28-4579-93f4-4536eb3f2671","Type":"ContainerStarted","Data":"9c015451911a5023fabf8fdc2342cdc71b6a2628d2de674a8a755d5db9a9f463"} Oct 10 08:31:15 crc kubenswrapper[4732]: I1010 08:31:15.283119 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" podStartSLOduration=2.70708222 podStartE2EDuration="10.283097218s" podCreationTimestamp="2025-10-10 08:31:05 +0000 UTC" firstStartedPulling="2025-10-10 08:31:07.149092219 +0000 UTC m=+5994.218683460" lastFinishedPulling="2025-10-10 08:31:14.725107217 +0000 UTC m=+6001.794698458" observedRunningTime="2025-10-10 08:31:15.279130822 +0000 UTC m=+6002.348722083" watchObservedRunningTime="2025-10-10 08:31:15.283097218 +0000 UTC m=+6002.352688459" Oct 10 08:31:16 crc kubenswrapper[4732]: I1010 08:31:16.110087 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" Oct 10 08:31:16 crc kubenswrapper[4732]: I1010 08:31:16.146476 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-tds2n" podStartSLOduration=3.41473104 podStartE2EDuration="11.146455906s" podCreationTimestamp="2025-10-10 08:31:05 +0000 UTC" firstStartedPulling="2025-10-10 08:31:07.032998131 +0000 UTC 
m=+5994.102589372" lastFinishedPulling="2025-10-10 08:31:14.764722997 +0000 UTC m=+6001.834314238" observedRunningTime="2025-10-10 08:31:15.32054182 +0000 UTC m=+6002.390133081" watchObservedRunningTime="2025-10-10 08:31:16.146455906 +0000 UTC m=+6003.216047147" Oct 10 08:31:16 crc kubenswrapper[4732]: I1010 08:31:16.218295 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:16 crc kubenswrapper[4732]: I1010 08:31:16.281182 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" event={"ID":"b83736e0-1c5a-4160-a56a-eaaa894e994d","Type":"ContainerStarted","Data":"9670f5cf39ce87b402a650b6d346c9d98422d203e36e02b96fe74822325399f9"} Oct 10 08:31:16 crc kubenswrapper[4732]: I1010 08:31:16.284372 4732 generic.go:334] "Generic (PLEG): container finished" podID="31624964-5c28-4579-93f4-4536eb3f2671" containerID="9c015451911a5023fabf8fdc2342cdc71b6a2628d2de674a8a755d5db9a9f463" exitCode=0 Oct 10 08:31:16 crc kubenswrapper[4732]: I1010 08:31:16.284451 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksrn5" event={"ID":"31624964-5c28-4579-93f4-4536eb3f2671","Type":"ContainerDied","Data":"9c015451911a5023fabf8fdc2342cdc71b6a2628d2de674a8a755d5db9a9f463"} Oct 10 08:31:16 crc kubenswrapper[4732]: I1010 08:31:16.298869 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-2bz54" podStartSLOduration=3.649790274 podStartE2EDuration="11.298850627s" podCreationTimestamp="2025-10-10 08:31:05 +0000 UTC" firstStartedPulling="2025-10-10 08:31:07.078861979 +0000 UTC m=+5994.148453220" lastFinishedPulling="2025-10-10 08:31:14.727922332 +0000 UTC m=+6001.797513573" observedRunningTime="2025-10-10 08:31:16.297436289 +0000 UTC m=+6003.367027550" watchObservedRunningTime="2025-10-10 08:31:16.298850627 +0000 UTC 
m=+6003.368441878" Oct 10 08:31:17 crc kubenswrapper[4732]: I1010 08:31:17.297879 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksrn5" event={"ID":"31624964-5c28-4579-93f4-4536eb3f2671","Type":"ContainerStarted","Data":"11f5443b312edccbfab75fffb8bd3bf709de7f3b2f858549ed8e5d76bb956cb0"} Oct 10 08:31:17 crc kubenswrapper[4732]: I1010 08:31:17.324392 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ksrn5" podStartSLOduration=5.351725597 podStartE2EDuration="11.324373708s" podCreationTimestamp="2025-10-10 08:31:06 +0000 UTC" firstStartedPulling="2025-10-10 08:31:10.816440693 +0000 UTC m=+5997.886031934" lastFinishedPulling="2025-10-10 08:31:16.789088804 +0000 UTC m=+6003.858680045" observedRunningTime="2025-10-10 08:31:17.322110677 +0000 UTC m=+6004.391701918" watchObservedRunningTime="2025-10-10 08:31:17.324373708 +0000 UTC m=+6004.393964949" Oct 10 08:31:26 crc kubenswrapper[4732]: I1010 08:31:26.220824 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-cwnj8" Oct 10 08:31:27 crc kubenswrapper[4732]: I1010 08:31:27.251429 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:27 crc kubenswrapper[4732]: I1010 08:31:27.251486 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:27 crc kubenswrapper[4732]: I1010 08:31:27.298664 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:27 crc kubenswrapper[4732]: I1010 08:31:27.453567 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.835812 4732 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.836263 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="da789cd8-70ea-4eeb-85ce-b0fc33468b8d" containerName="openstackclient" containerID="cri-o://c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d" gracePeriod=2 Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.852201 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.928076 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:28 crc kubenswrapper[4732]: E1010 08:31:28.928589 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da789cd8-70ea-4eeb-85ce-b0fc33468b8d" containerName="openstackclient" Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.928608 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="da789cd8-70ea-4eeb-85ce-b0fc33468b8d" containerName="openstackclient" Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.928842 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="da789cd8-70ea-4eeb-85ce-b0fc33468b8d" containerName="openstackclient" Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.929586 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.933729 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="da789cd8-70ea-4eeb-85ce-b0fc33468b8d" podUID="325e4c75-6232-4b1c-9e71-074c12027a24" Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.950913 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.960089 4732 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"325e4c75-6232-4b1c-9e71-074c12027a24\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T08:31:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T08:31:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T08:31:28Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-10T08:31:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-openstackclient:c4b77291aeca5591ac860bd4127cec2f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-42xzx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-10T08:31:28Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Oct 10 08:31:28 crc kubenswrapper[4732]: I1010 08:31:28.982771 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:28 crc kubenswrapper[4732]: E1010 08:31:28.984293 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-42xzx openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="325e4c75-6232-4b1c-9e71-074c12027a24" Oct 10 08:31:29 
crc kubenswrapper[4732]: I1010 08:31:29.007389 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42xzx\" (UniqueName: \"kubernetes.io/projected/325e4c75-6232-4b1c-9e71-074c12027a24-kube-api-access-42xzx\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.007473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.007499 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-combined-ca-bundle\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.007575 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config-secret\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.017013 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.065522 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.081392 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.111050 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42xzx\" (UniqueName: \"kubernetes.io/projected/325e4c75-6232-4b1c-9e71-074c12027a24-kube-api-access-42xzx\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.111137 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.111163 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-combined-ca-bundle\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.111243 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config-secret\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.117898 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.119306 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config\") pod \"openstackclient\" (UID: 
\"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.119587 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="325e4c75-6232-4b1c-9e71-074c12027a24" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.127406 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-combined-ca-bundle\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: E1010 08:31:29.132852 4732 projected.go:194] Error preparing data for projected volume kube-api-access-42xzx for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (325e4c75-6232-4b1c-9e71-074c12027a24) does not match the UID in record. The object might have been deleted and then recreated Oct 10 08:31:29 crc kubenswrapper[4732]: E1010 08:31:29.132949 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/325e4c75-6232-4b1c-9e71-074c12027a24-kube-api-access-42xzx podName:325e4c75-6232-4b1c-9e71-074c12027a24 nodeName:}" failed. No retries permitted until 2025-10-10 08:31:29.632927225 +0000 UTC m=+6016.702518466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-42xzx" (UniqueName: "kubernetes.io/projected/325e4c75-6232-4b1c-9e71-074c12027a24-kube-api-access-42xzx") pod "openstackclient" (UID: "325e4c75-6232-4b1c-9e71-074c12027a24") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (325e4c75-6232-4b1c-9e71-074c12027a24) does not match the UID in record. 
The object might have been deleted and then recreated Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.154220 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config-secret\") pod \"openstackclient\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.195193 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksrn5"] Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.213019 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.213175 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.213253 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.213288 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxl7l\" (UniqueName: 
\"kubernetes.io/projected/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-kube-api-access-pxl7l\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.316737 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.316867 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.316930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.316957 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxl7l\" (UniqueName: \"kubernetes.io/projected/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-kube-api-access-pxl7l\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.322292 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config\") pod \"openstackclient\" (UID: 
\"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.325335 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config-secret\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.332617 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.364232 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxl7l\" (UniqueName: \"kubernetes.io/projected/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-kube-api-access-pxl7l\") pod \"openstackclient\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.444640 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ksrn5" podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="registry-server" containerID="cri-o://11f5443b312edccbfab75fffb8bd3bf709de7f3b2f858549ed8e5d76bb956cb0" gracePeriod=2 Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.444797 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.458017 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.460918 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.489050 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zllpt" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.491914 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.524650 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="325e4c75-6232-4b1c-9e71-074c12027a24" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.527200 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qndl9\" (UniqueName: \"kubernetes.io/projected/aa613f10-ba46-4e24-8785-055e40ac96cb-kube-api-access-qndl9\") pod \"kube-state-metrics-0\" (UID: \"aa613f10-ba46-4e24-8785-055e40ac96cb\") " pod="openstack/kube-state-metrics-0" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.543496 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.572713 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.628188 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-combined-ca-bundle\") pod \"325e4c75-6232-4b1c-9e71-074c12027a24\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.628337 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config-secret\") pod \"325e4c75-6232-4b1c-9e71-074c12027a24\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.628465 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config\") pod \"325e4c75-6232-4b1c-9e71-074c12027a24\" (UID: \"325e4c75-6232-4b1c-9e71-074c12027a24\") " Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.629005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qndl9\" (UniqueName: \"kubernetes.io/projected/aa613f10-ba46-4e24-8785-055e40ac96cb-kube-api-access-qndl9\") pod \"kube-state-metrics-0\" (UID: \"aa613f10-ba46-4e24-8785-055e40ac96cb\") " pod="openstack/kube-state-metrics-0" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.629193 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42xzx\" (UniqueName: \"kubernetes.io/projected/325e4c75-6232-4b1c-9e71-074c12027a24-kube-api-access-42xzx\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:29 crc 
kubenswrapper[4732]: I1010 08:31:29.631336 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "325e4c75-6232-4b1c-9e71-074c12027a24" (UID: "325e4c75-6232-4b1c-9e71-074c12027a24"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.637136 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "325e4c75-6232-4b1c-9e71-074c12027a24" (UID: "325e4c75-6232-4b1c-9e71-074c12027a24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.652913 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "325e4c75-6232-4b1c-9e71-074c12027a24" (UID: "325e4c75-6232-4b1c-9e71-074c12027a24"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.692981 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325e4c75-6232-4b1c-9e71-074c12027a24" path="/var/lib/kubelet/pods/325e4c75-6232-4b1c-9e71-074c12027a24/volumes" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.732032 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.732363 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/325e4c75-6232-4b1c-9e71-074c12027a24-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.732378 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e4c75-6232-4b1c-9e71-074c12027a24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.767307 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qndl9\" (UniqueName: \"kubernetes.io/projected/aa613f10-ba46-4e24-8785-055e40ac96cb-kube-api-access-qndl9\") pod \"kube-state-metrics-0\" (UID: \"aa613f10-ba46-4e24-8785-055e40ac96cb\") " pod="openstack/kube-state-metrics-0" Oct 10 08:31:29 crc kubenswrapper[4732]: I1010 08:31:29.793195 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.485060 4732 generic.go:334] "Generic (PLEG): container finished" podID="31624964-5c28-4579-93f4-4536eb3f2671" containerID="11f5443b312edccbfab75fffb8bd3bf709de7f3b2f858549ed8e5d76bb956cb0" exitCode=0 Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.485364 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.486045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksrn5" event={"ID":"31624964-5c28-4579-93f4-4536eb3f2671","Type":"ContainerDied","Data":"11f5443b312edccbfab75fffb8bd3bf709de7f3b2f858549ed8e5d76bb956cb0"} Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.486069 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ksrn5" event={"ID":"31624964-5c28-4579-93f4-4536eb3f2671","Type":"ContainerDied","Data":"cba9748297ba654142f5a8122121ca6e85d62f42f6d9e9d73b2da508b04a3e78"} Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.486083 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cba9748297ba654142f5a8122121ca6e85d62f42f6d9e9d73b2da508b04a3e78" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.488663 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="325e4c75-6232-4b1c-9e71-074c12027a24" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.519985 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.520253 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" 
oldPodUID="325e4c75-6232-4b1c-9e71-074c12027a24" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.524198 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.533349 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.533533 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.533781 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-nnbpr" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.534018 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.550362 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.596558 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.760373 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-catalog-content\") pod \"31624964-5c28-4579-93f4-4536eb3f2671\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.760925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-utilities\") pod \"31624964-5c28-4579-93f4-4536eb3f2671\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.761294 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bs4k\" (UniqueName: \"kubernetes.io/projected/31624964-5c28-4579-93f4-4536eb3f2671-kube-api-access-7bs4k\") pod \"31624964-5c28-4579-93f4-4536eb3f2671\" (UID: \"31624964-5c28-4579-93f4-4536eb3f2671\") " Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.774804 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-utilities" (OuterVolumeSpecName: "utilities") pod "31624964-5c28-4579-93f4-4536eb3f2671" (UID: "31624964-5c28-4579-93f4-4536eb3f2671"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.791811 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/18c40f62-0e3e-413b-ba46-ba26ea267b7f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.791903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/18c40f62-0e3e-413b-ba46-ba26ea267b7f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.792286 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtzw\" (UniqueName: \"kubernetes.io/projected/18c40f62-0e3e-413b-ba46-ba26ea267b7f-kube-api-access-cmtzw\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.792359 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/18c40f62-0e3e-413b-ba46-ba26ea267b7f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.792395 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/18c40f62-0e3e-413b-ba46-ba26ea267b7f-config-out\") pod 
\"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.792616 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/18c40f62-0e3e-413b-ba46-ba26ea267b7f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.792714 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.793003 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.793423 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31624964-5c28-4579-93f4-4536eb3f2671-kube-api-access-7bs4k" (OuterVolumeSpecName: "kube-api-access-7bs4k") pod "31624964-5c28-4579-93f4-4536eb3f2671" (UID: "31624964-5c28-4579-93f4-4536eb3f2671"). InnerVolumeSpecName "kube-api-access-7bs4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.826253 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.862241 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31624964-5c28-4579-93f4-4536eb3f2671" (UID: "31624964-5c28-4579-93f4-4536eb3f2671"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.899902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtzw\" (UniqueName: \"kubernetes.io/projected/18c40f62-0e3e-413b-ba46-ba26ea267b7f-kube-api-access-cmtzw\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.899964 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/18c40f62-0e3e-413b-ba46-ba26ea267b7f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.899985 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/18c40f62-0e3e-413b-ba46-ba26ea267b7f-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.900063 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/18c40f62-0e3e-413b-ba46-ba26ea267b7f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.900089 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/18c40f62-0e3e-413b-ba46-ba26ea267b7f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 
08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.900108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/18c40f62-0e3e-413b-ba46-ba26ea267b7f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.900169 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bs4k\" (UniqueName: \"kubernetes.io/projected/31624964-5c28-4579-93f4-4536eb3f2671-kube-api-access-7bs4k\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.900180 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31624964-5c28-4579-93f4-4536eb3f2671-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.900554 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/18c40f62-0e3e-413b-ba46-ba26ea267b7f-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.904075 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/18c40f62-0e3e-413b-ba46-ba26ea267b7f-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.907898 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/18c40f62-0e3e-413b-ba46-ba26ea267b7f-config-out\") 
pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.913466 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/18c40f62-0e3e-413b-ba46-ba26ea267b7f-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.913562 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/18c40f62-0e3e-413b-ba46-ba26ea267b7f-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.916810 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtzw\" (UniqueName: \"kubernetes.io/projected/18c40f62-0e3e-413b-ba46-ba26ea267b7f-kube-api-access-cmtzw\") pod \"alertmanager-metric-storage-0\" (UID: \"18c40f62-0e3e-413b-ba46-ba26ea267b7f\") " pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.977887 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:31:30 crc kubenswrapper[4732]: E1010 08:31:30.978564 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="extract-utilities" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.978582 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="extract-utilities" Oct 10 08:31:30 crc kubenswrapper[4732]: E1010 08:31:30.978617 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="registry-server" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.978624 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="registry-server" Oct 10 08:31:30 crc kubenswrapper[4732]: E1010 08:31:30.978636 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="extract-content" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.978642 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="extract-content" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.978845 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="31624964-5c28-4579-93f4-4536eb3f2671" containerName="registry-server" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.980639 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.985845 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-llmgc" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.985845 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.986051 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.986112 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 10 08:31:30 crc kubenswrapper[4732]: I1010 08:31:30.986168 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 10 08:31:30 crc kubenswrapper[4732]: 
I1010 08:31:30.987630 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.014173 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.106268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxjp4\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-kube-api-access-sxjp4\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.106715 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.106869 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/72b13943-f3b6-48b7-bd72-33eec797e724-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.106995 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-config\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.107170 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.107353 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/72b13943-f3b6-48b7-bd72-33eec797e724-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.107476 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.107634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.127653 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.169924 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.209524 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config-secret\") pod \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.209593 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-combined-ca-bundle\") pod \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.209672 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqgp7\" (UniqueName: \"kubernetes.io/projected/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-kube-api-access-xqgp7\") pod \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.209839 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config\") pod \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\" (UID: \"da789cd8-70ea-4eeb-85ce-b0fc33468b8d\") " Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210206 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210238 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/72b13943-f3b6-48b7-bd72-33eec797e724-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210273 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-config\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210340 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210429 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/72b13943-f3b6-48b7-bd72-33eec797e724-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210464 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210506 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.210566 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxjp4\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-kube-api-access-sxjp4\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.212073 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/72b13943-f3b6-48b7-bd72-33eec797e724-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.217119 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.220787 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-kube-api-access-xqgp7" (OuterVolumeSpecName: "kube-api-access-xqgp7") pod "da789cd8-70ea-4eeb-85ce-b0fc33468b8d" (UID: "da789cd8-70ea-4eeb-85ce-b0fc33468b8d"). InnerVolumeSpecName "kube-api-access-xqgp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.226963 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.229819 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.230523 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.230563 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3cc69d67ea5d204bb1531707b1a0fc5e3407342e2e711ee32a5e22dfb4c3ca84/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.231258 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-config\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.241067 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/72b13943-f3b6-48b7-bd72-33eec797e724-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.269376 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxjp4\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-kube-api-access-sxjp4\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.292421 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"da789cd8-70ea-4eeb-85ce-b0fc33468b8d" (UID: "da789cd8-70ea-4eeb-85ce-b0fc33468b8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.305973 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "da789cd8-70ea-4eeb-85ce-b0fc33468b8d" (UID: "da789cd8-70ea-4eeb-85ce-b0fc33468b8d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.321133 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.324253 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.324289 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqgp7\" (UniqueName: \"kubernetes.io/projected/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-kube-api-access-xqgp7\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.324303 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.497391 4732 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "da789cd8-70ea-4eeb-85ce-b0fc33468b8d" (UID: "da789cd8-70ea-4eeb-85ce-b0fc33468b8d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.517739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90","Type":"ContainerStarted","Data":"dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34"} Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.517802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90","Type":"ContainerStarted","Data":"9fc4d9561d023afe9b0dc9168a02e48b4d328fbd9382bbf8f5c6524584cf8bff"} Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.535884 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da789cd8-70ea-4eeb-85ce-b0fc33468b8d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.541155 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"aa613f10-ba46-4e24-8785-055e40ac96cb","Type":"ContainerStarted","Data":"6edf6228ac4eb8d15ea7e2f58e0a2a102657d62cf96b3865ef23c0b1ccae3bd8"} Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.557163 4732 generic.go:334] "Generic (PLEG): container finished" podID="da789cd8-70ea-4eeb-85ce-b0fc33468b8d" containerID="c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d" exitCode=137 Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.557301 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ksrn5" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.562650 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.565360 4732 scope.go:117] "RemoveContainer" containerID="c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.609443 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.6092187940000002 podStartE2EDuration="3.609218794s" podCreationTimestamp="2025-10-10 08:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:31:31.550523343 +0000 UTC m=+6018.620114594" watchObservedRunningTime="2025-10-10 08:31:31.609218794 +0000 UTC m=+6018.678810035" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.616265 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.690828 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da789cd8-70ea-4eeb-85ce-b0fc33468b8d" path="/var/lib/kubelet/pods/da789cd8-70ea-4eeb-85ce-b0fc33468b8d/volumes" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.723091 4732 scope.go:117] "RemoveContainer" containerID="c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d" Oct 10 08:31:31 crc kubenswrapper[4732]: E1010 08:31:31.726886 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d\": container with ID starting with c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d not found: ID does not exist" containerID="c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.726925 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d"} err="failed to get container status \"c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d\": rpc error: code = NotFound desc = could not find container \"c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d\": container with ID starting with c322775849cbf2bef8ba49a253ef8a90574eeaba91e9127444b6b3c911b8b94d not found: ID does not exist" Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.752572 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksrn5"] Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.766342 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ksrn5"] Oct 10 08:31:31 crc kubenswrapper[4732]: I1010 08:31:31.888210 4732 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 10 08:31:31 crc kubenswrapper[4732]: W1010 08:31:31.891778 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c40f62_0e3e_413b_ba46_ba26ea267b7f.slice/crio-9de69654d72d859f2fc6f23aa7361b358ef72c3e6c2c9ecdfafc9b5ae31f86db WatchSource:0}: Error finding container 9de69654d72d859f2fc6f23aa7361b358ef72c3e6c2c9ecdfafc9b5ae31f86db: Status 404 returned error can't find the container with id 9de69654d72d859f2fc6f23aa7361b358ef72c3e6c2c9ecdfafc9b5ae31f86db Oct 10 08:31:32 crc kubenswrapper[4732]: I1010 08:31:32.210627 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:31:32 crc kubenswrapper[4732]: W1010 08:31:32.213677 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72b13943_f3b6_48b7_bd72_33eec797e724.slice/crio-a14b6d86603c9f61759afe1d9efeff1be17ee8b1bf4b21dde1f35444dfb25671 WatchSource:0}: Error finding container a14b6d86603c9f61759afe1d9efeff1be17ee8b1bf4b21dde1f35444dfb25671: Status 404 returned error can't find the container with id a14b6d86603c9f61759afe1d9efeff1be17ee8b1bf4b21dde1f35444dfb25671 Oct 10 08:31:32 crc kubenswrapper[4732]: I1010 08:31:32.569218 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"18c40f62-0e3e-413b-ba46-ba26ea267b7f","Type":"ContainerStarted","Data":"9de69654d72d859f2fc6f23aa7361b358ef72c3e6c2c9ecdfafc9b5ae31f86db"} Oct 10 08:31:32 crc kubenswrapper[4732]: I1010 08:31:32.571506 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"aa613f10-ba46-4e24-8785-055e40ac96cb","Type":"ContainerStarted","Data":"40b3998a24e004f183069cac1cb70a9af429501def31e5e9c9774bf841d27889"} Oct 10 08:31:32 crc kubenswrapper[4732]: I1010 08:31:32.571652 
4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 10 08:31:32 crc kubenswrapper[4732]: I1010 08:31:32.573146 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerStarted","Data":"a14b6d86603c9f61759afe1d9efeff1be17ee8b1bf4b21dde1f35444dfb25671"} Oct 10 08:31:32 crc kubenswrapper[4732]: I1010 08:31:32.597743 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.148181238 podStartE2EDuration="3.597723895s" podCreationTimestamp="2025-10-10 08:31:29 +0000 UTC" firstStartedPulling="2025-10-10 08:31:30.801346562 +0000 UTC m=+6017.870937803" lastFinishedPulling="2025-10-10 08:31:31.250889219 +0000 UTC m=+6018.320480460" observedRunningTime="2025-10-10 08:31:32.584990694 +0000 UTC m=+6019.654581955" watchObservedRunningTime="2025-10-10 08:31:32.597723895 +0000 UTC m=+6019.667315136" Oct 10 08:31:33 crc kubenswrapper[4732]: I1010 08:31:33.677334 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31624964-5c28-4579-93f4-4536eb3f2671" path="/var/lib/kubelet/pods/31624964-5c28-4579-93f4-4536eb3f2671/volumes" Oct 10 08:31:35 crc kubenswrapper[4732]: I1010 08:31:35.151251 4732 scope.go:117] "RemoveContainer" containerID="2aa093add240e8511c83d6b028157319f00bd051a5e9cb68dd4aa4480811b567" Oct 10 08:31:35 crc kubenswrapper[4732]: I1010 08:31:35.171781 4732 scope.go:117] "RemoveContainer" containerID="2f3215762fcf4416eb0d4faa62ff1ca6bd6935426fb5aa9183eddcde9db80fe3" Oct 10 08:31:38 crc kubenswrapper[4732]: I1010 08:31:38.631637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerStarted","Data":"540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231"} Oct 10 08:31:38 crc 
kubenswrapper[4732]: I1010 08:31:38.633771 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"18c40f62-0e3e-413b-ba46-ba26ea267b7f","Type":"ContainerStarted","Data":"eeb30c9a5acee61c1eb77b66a61602a8acdd292b72b9c137d276e9ec86e54938"} Oct 10 08:31:39 crc kubenswrapper[4732]: I1010 08:31:39.799322 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 10 08:31:46 crc kubenswrapper[4732]: I1010 08:31:46.707590 4732 generic.go:334] "Generic (PLEG): container finished" podID="72b13943-f3b6-48b7-bd72-33eec797e724" containerID="540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231" exitCode=0 Oct 10 08:31:46 crc kubenswrapper[4732]: I1010 08:31:46.707640 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerDied","Data":"540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231"} Oct 10 08:31:46 crc kubenswrapper[4732]: I1010 08:31:46.713141 4732 generic.go:334] "Generic (PLEG): container finished" podID="18c40f62-0e3e-413b-ba46-ba26ea267b7f" containerID="eeb30c9a5acee61c1eb77b66a61602a8acdd292b72b9c137d276e9ec86e54938" exitCode=0 Oct 10 08:31:46 crc kubenswrapper[4732]: I1010 08:31:46.713181 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"18c40f62-0e3e-413b-ba46-ba26ea267b7f","Type":"ContainerDied","Data":"eeb30c9a5acee61c1eb77b66a61602a8acdd292b72b9c137d276e9ec86e54938"} Oct 10 08:31:49 crc kubenswrapper[4732]: I1010 08:31:49.749301 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"18c40f62-0e3e-413b-ba46-ba26ea267b7f","Type":"ContainerStarted","Data":"655a712238f6f6d3798f821ee3af2c614682f4c9dad42d4e8eda1d627af58609"} Oct 10 08:31:54 crc kubenswrapper[4732]: I1010 08:31:54.065999 4732 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9hfhx"] Oct 10 08:31:54 crc kubenswrapper[4732]: I1010 08:31:54.074562 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9hfhx"] Oct 10 08:31:54 crc kubenswrapper[4732]: I1010 08:31:54.804726 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerStarted","Data":"28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778"} Oct 10 08:31:54 crc kubenswrapper[4732]: I1010 08:31:54.808282 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"18c40f62-0e3e-413b-ba46-ba26ea267b7f","Type":"ContainerStarted","Data":"6f420bcea6e49637fc40e9d4a3356edd44541f59346a42154a5c6e939c128fd8"} Oct 10 08:31:54 crc kubenswrapper[4732]: I1010 08:31:54.809369 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:54 crc kubenswrapper[4732]: I1010 08:31:54.813093 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 10 08:31:54 crc kubenswrapper[4732]: I1010 08:31:54.833824 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.00614281 podStartE2EDuration="24.833796607s" podCreationTimestamp="2025-10-10 08:31:30 +0000 UTC" firstStartedPulling="2025-10-10 08:31:31.895320546 +0000 UTC m=+6018.964911787" lastFinishedPulling="2025-10-10 08:31:48.722974303 +0000 UTC m=+6035.792565584" observedRunningTime="2025-10-10 08:31:54.831921276 +0000 UTC m=+6041.901512527" watchObservedRunningTime="2025-10-10 08:31:54.833796607 +0000 UTC m=+6041.903387848" Oct 10 08:31:55 crc kubenswrapper[4732]: I1010 08:31:55.671336 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0ef503bf-2e69-4fdc-89d6-0ab09f364bbb" path="/var/lib/kubelet/pods/0ef503bf-2e69-4fdc-89d6-0ab09f364bbb/volumes" Oct 10 08:31:58 crc kubenswrapper[4732]: I1010 08:31:58.855054 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerStarted","Data":"605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf"} Oct 10 08:32:01 crc kubenswrapper[4732]: I1010 08:32:01.915354 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerStarted","Data":"f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5"} Oct 10 08:32:01 crc kubenswrapper[4732]: I1010 08:32:01.941640 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.543909224 podStartE2EDuration="32.941615338s" podCreationTimestamp="2025-10-10 08:31:29 +0000 UTC" firstStartedPulling="2025-10-10 08:31:32.21774381 +0000 UTC m=+6019.287335051" lastFinishedPulling="2025-10-10 08:32:01.615449924 +0000 UTC m=+6048.685041165" observedRunningTime="2025-10-10 08:32:01.94018801 +0000 UTC m=+6049.009779271" watchObservedRunningTime="2025-10-10 08:32:01.941615338 +0000 UTC m=+6049.011206589" Oct 10 08:32:06 crc kubenswrapper[4732]: I1010 08:32:06.617375 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.678728 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.683210 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.686073 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.686304 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.697682 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.805271 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-scripts\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.805409 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qqx6\" (UniqueName: \"kubernetes.io/projected/440be690-fb57-4a22-be01-05a8113a84a6-kube-api-access-6qqx6\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.805463 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-log-httpd\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.805650 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-config-data\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " 
pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.805836 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.805982 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.806068 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-run-httpd\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.909463 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.909534 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.909571 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-run-httpd\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.909630 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-scripts\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.909669 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qqx6\" (UniqueName: \"kubernetes.io/projected/440be690-fb57-4a22-be01-05a8113a84a6-kube-api-access-6qqx6\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.909718 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-log-httpd\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.909761 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-config-data\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.910973 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-run-httpd\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" 
Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.911261 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-log-httpd\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.916732 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-config-data\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.916757 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-scripts\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.917113 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.917264 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:10 crc kubenswrapper[4732]: I1010 08:32:10.928327 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qqx6\" (UniqueName: \"kubernetes.io/projected/440be690-fb57-4a22-be01-05a8113a84a6-kube-api-access-6qqx6\") pod 
\"ceilometer-0\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " pod="openstack/ceilometer-0" Oct 10 08:32:11 crc kubenswrapper[4732]: I1010 08:32:11.010090 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:32:11 crc kubenswrapper[4732]: W1010 08:32:11.477414 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod440be690_fb57_4a22_be01_05a8113a84a6.slice/crio-2803bff84de7355bc95408b6682b7ba9ab0329bbd10b7cc90d234034de943ce6 WatchSource:0}: Error finding container 2803bff84de7355bc95408b6682b7ba9ab0329bbd10b7cc90d234034de943ce6: Status 404 returned error can't find the container with id 2803bff84de7355bc95408b6682b7ba9ab0329bbd10b7cc90d234034de943ce6 Oct 10 08:32:11 crc kubenswrapper[4732]: I1010 08:32:11.479652 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:12 crc kubenswrapper[4732]: I1010 08:32:12.009007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerStarted","Data":"2803bff84de7355bc95408b6682b7ba9ab0329bbd10b7cc90d234034de943ce6"} Oct 10 08:32:16 crc kubenswrapper[4732]: I1010 08:32:16.051564 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerStarted","Data":"8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e"} Oct 10 08:32:16 crc kubenswrapper[4732]: I1010 08:32:16.618136 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:16 crc kubenswrapper[4732]: I1010 08:32:16.620430 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:17 crc kubenswrapper[4732]: I1010 08:32:17.063306 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerStarted","Data":"c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6"} Oct 10 08:32:17 crc kubenswrapper[4732]: I1010 08:32:17.064590 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.074066 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerStarted","Data":"c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643"} Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.413286 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.413587 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" containerName="openstackclient" containerID="cri-o://dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34" gracePeriod=2 Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.421264 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" podUID="302e339c-dd64-4aa3-8801-6cc7d9fa93aa" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.445228 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.459972 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: E1010 08:32:18.460662 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" containerName="openstackclient" Oct 10 08:32:18 crc 
kubenswrapper[4732]: I1010 08:32:18.460681 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" containerName="openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.461135 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" containerName="openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.462755 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.489657 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.511131 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: E1010 08:32:18.512213 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-jgjrp openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-jgjrp openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="302e339c-dd64-4aa3-8801-6cc7d9fa93aa" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.520744 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.534896 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.538649 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.541418 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="302e339c-dd64-4aa3-8801-6cc7d9fa93aa" podUID="c449ab79-27fb-47ce-8e4c-fc160420cddf" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.547104 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.679091 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c449ab79-27fb-47ce-8e4c-fc160420cddf-openstack-config-secret\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.679140 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c449ab79-27fb-47ce-8e4c-fc160420cddf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.679265 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znph\" (UniqueName: \"kubernetes.io/projected/c449ab79-27fb-47ce-8e4c-fc160420cddf-kube-api-access-5znph\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.679299 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c449ab79-27fb-47ce-8e4c-fc160420cddf-openstack-config\") pod \"openstackclient\" (UID: 
\"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.780829 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5znph\" (UniqueName: \"kubernetes.io/projected/c449ab79-27fb-47ce-8e4c-fc160420cddf-kube-api-access-5znph\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.780913 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c449ab79-27fb-47ce-8e4c-fc160420cddf-openstack-config\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.781017 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c449ab79-27fb-47ce-8e4c-fc160420cddf-openstack-config-secret\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.781061 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c449ab79-27fb-47ce-8e4c-fc160420cddf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.782832 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c449ab79-27fb-47ce-8e4c-fc160420cddf-openstack-config\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.787231 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c449ab79-27fb-47ce-8e4c-fc160420cddf-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.792785 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c449ab79-27fb-47ce-8e4c-fc160420cddf-openstack-config-secret\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.800020 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znph\" (UniqueName: \"kubernetes.io/projected/c449ab79-27fb-47ce-8e4c-fc160420cddf-kube-api-access-5znph\") pod \"openstackclient\" (UID: \"c449ab79-27fb-47ce-8e4c-fc160420cddf\") " pod="openstack/openstackclient" Oct 10 08:32:18 crc kubenswrapper[4732]: I1010 08:32:18.864941 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.082499 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.090097 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="302e339c-dd64-4aa3-8801-6cc7d9fa93aa" podUID="c449ab79-27fb-47ce-8e4c-fc160420cddf" Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.095866 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.101358 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="302e339c-dd64-4aa3-8801-6cc7d9fa93aa" podUID="c449ab79-27fb-47ce-8e4c-fc160420cddf" Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.446720 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 10 08:32:19 crc kubenswrapper[4732]: W1010 08:32:19.450046 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc449ab79_27fb_47ce_8e4c_fc160420cddf.slice/crio-1d1d8bcdaabdad05377c383f55926725cde289b2e31dc67cdc7d1b5a49abf8c0 WatchSource:0}: Error finding container 1d1d8bcdaabdad05377c383f55926725cde289b2e31dc67cdc7d1b5a49abf8c0: Status 404 returned error can't find the container with id 1d1d8bcdaabdad05377c383f55926725cde289b2e31dc67cdc7d1b5a49abf8c0 Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.674077 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302e339c-dd64-4aa3-8801-6cc7d9fa93aa" path="/var/lib/kubelet/pods/302e339c-dd64-4aa3-8801-6cc7d9fa93aa/volumes" Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.741851 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.742091 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="prometheus" containerID="cri-o://28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778" gracePeriod=600 Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.742191 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="config-reloader" containerID="cri-o://605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf" gracePeriod=600 Oct 10 08:32:19 crc kubenswrapper[4732]: I1010 08:32:19.742196 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="thanos-sidecar" containerID="cri-o://f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5" gracePeriod=600 Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.099554 4732 generic.go:334] "Generic (PLEG): container finished" podID="72b13943-f3b6-48b7-bd72-33eec797e724" containerID="f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5" exitCode=0 Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.099612 4732 generic.go:334] "Generic (PLEG): container finished" podID="72b13943-f3b6-48b7-bd72-33eec797e724" containerID="28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778" exitCode=0 Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.099656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerDied","Data":"f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5"} Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.099731 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerDied","Data":"28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778"} Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.105285 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c449ab79-27fb-47ce-8e4c-fc160420cddf","Type":"ContainerStarted","Data":"2272a64c7b14b0d451c80a4bb8bb1b6afb69d63de19d121a60ef6971dfb99a60"} Oct 10 08:32:20 crc 
kubenswrapper[4732]: I1010 08:32:20.105336 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c449ab79-27fb-47ce-8e4c-fc160420cddf","Type":"ContainerStarted","Data":"1d1d8bcdaabdad05377c383f55926725cde289b2e31dc67cdc7d1b5a49abf8c0"} Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.112805 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.112916 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerStarted","Data":"2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02"} Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.124368 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.12432749 podStartE2EDuration="2.12432749s" podCreationTimestamp="2025-10-10 08:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:32:20.120986281 +0000 UTC m=+6067.190577532" watchObservedRunningTime="2025-10-10 08:32:20.12432749 +0000 UTC m=+6067.193918731" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.129177 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="302e339c-dd64-4aa3-8801-6cc7d9fa93aa" podUID="c449ab79-27fb-47ce-8e4c-fc160420cddf" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.156302 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.304979695 podStartE2EDuration="10.156286066s" podCreationTimestamp="2025-10-10 08:32:10 +0000 UTC" firstStartedPulling="2025-10-10 08:32:11.480651281 +0000 UTC m=+6058.550242522" lastFinishedPulling="2025-10-10 
08:32:19.331957652 +0000 UTC m=+6066.401548893" observedRunningTime="2025-10-10 08:32:20.148353173 +0000 UTC m=+6067.217944434" watchObservedRunningTime="2025-10-10 08:32:20.156286066 +0000 UTC m=+6067.225877307" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.700789 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.727460 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-web-config\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825329 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-combined-ca-bundle\") pod \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825543 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825572 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-config\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " 
Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825594 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/72b13943-f3b6-48b7-bd72-33eec797e724-prometheus-metric-storage-rulefiles-0\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825628 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxl7l\" (UniqueName: \"kubernetes.io/projected/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-kube-api-access-pxl7l\") pod \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825796 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config-secret\") pod \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825849 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-thanos-prometheus-http-client-file\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825876 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-tls-assets\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825909 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sxjp4\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-kube-api-access-sxjp4\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825932 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/72b13943-f3b6-48b7-bd72-33eec797e724-config-out\") pod \"72b13943-f3b6-48b7-bd72-33eec797e724\" (UID: \"72b13943-f3b6-48b7-bd72-33eec797e724\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.825990 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config\") pod \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\" (UID: \"a7cff40a-f51f-4eb8-8ae9-f59af55a4e90\") " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.830475 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72b13943-f3b6-48b7-bd72-33eec797e724-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.835186 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-kube-api-access-sxjp4" (OuterVolumeSpecName: "kube-api-access-sxjp4") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "kube-api-access-sxjp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.838710 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.838145 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72b13943-f3b6-48b7-bd72-33eec797e724-config-out" (OuterVolumeSpecName: "config-out") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.838256 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-config" (OuterVolumeSpecName: "config") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.835570 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.847157 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-kube-api-access-pxl7l" (OuterVolumeSpecName: "kube-api-access-pxl7l") pod "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" (UID: "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90"). InnerVolumeSpecName "kube-api-access-pxl7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.872304 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "pvc-9748a009-32df-4212-9452-772ce2d8a551". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.877289 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-web-config" (OuterVolumeSpecName: "web-config") pod "72b13943-f3b6-48b7-bd72-33eec797e724" (UID: "72b13943-f3b6-48b7-bd72-33eec797e724"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.887981 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" (UID: "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.919420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" (UID: "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930413 4732 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/72b13943-f3b6-48b7-bd72-33eec797e724-config-out\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930456 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930467 4732 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-web-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930476 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930507 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") on node \"crc\" " Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930520 4732 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930530 4732 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/72b13943-f3b6-48b7-bd72-33eec797e724-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930546 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxl7l\" (UniqueName: \"kubernetes.io/projected/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-kube-api-access-pxl7l\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930558 4732 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/72b13943-f3b6-48b7-bd72-33eec797e724-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930568 4732 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.930580 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxjp4\" (UniqueName: \"kubernetes.io/projected/72b13943-f3b6-48b7-bd72-33eec797e724-kube-api-access-sxjp4\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.936366 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" (UID: "a7cff40a-f51f-4eb8-8ae9-f59af55a4e90"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.969627 4732 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 10 08:32:20 crc kubenswrapper[4732]: I1010 08:32:20.969914 4732 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-9748a009-32df-4212-9452-772ce2d8a551" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551") on node "crc" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.031850 4732 reconciler_common.go:293] "Volume detached for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.031883 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.123865 4732 generic.go:334] "Generic (PLEG): container finished" podID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" containerID="dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34" exitCode=137 Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.123986 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.124189 4732 scope.go:117] "RemoveContainer" containerID="dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.131353 4732 generic.go:334] "Generic (PLEG): container finished" podID="72b13943-f3b6-48b7-bd72-33eec797e724" containerID="605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf" exitCode=0 Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.131678 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerDied","Data":"605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf"} Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.131771 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"72b13943-f3b6-48b7-bd72-33eec797e724","Type":"ContainerDied","Data":"a14b6d86603c9f61759afe1d9efeff1be17ee8b1bf4b21dde1f35444dfb25671"} Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.133196 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.133288 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.142618 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" podUID="c449ab79-27fb-47ce-8e4c-fc160420cddf" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.169213 4732 scope.go:117] "RemoveContainer" containerID="dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.171911 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34\": container with ID starting with dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34 not found: ID does not exist" containerID="dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.171957 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34"} err="failed to get container status \"dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34\": rpc error: code = NotFound desc = could not find container \"dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34\": container with ID starting with dc478976d84460c592818e402136aff2f013e59540c579b0760165dcb834dc34 not found: ID does not exist" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.171979 4732 scope.go:117] "RemoveContainer" containerID="f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.197770 4732 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.204133 4732 scope.go:117] "RemoveContainer" containerID="605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.210478 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.227133 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.228162 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="thanos-sidecar" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.228186 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="thanos-sidecar" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.228201 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="init-config-reloader" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.228210 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="init-config-reloader" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.228243 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="prometheus" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.228250 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="prometheus" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.228279 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="config-reloader" Oct 10 
08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.228286 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="config-reloader" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.228523 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="thanos-sidecar" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.228557 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="prometheus" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.228574 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" containerName="config-reloader" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.230992 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.232945 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.237915 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.238249 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.238515 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.238911 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.240190 4732 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-llmgc" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.241817 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.244917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.245748 4732 scope.go:117] "RemoveContainer" containerID="28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.287703 4732 scope.go:117] "RemoveContainer" containerID="540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.315165 4732 scope.go:117] "RemoveContainer" containerID="f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.315648 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5\": container with ID starting with f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5 not found: ID does not exist" containerID="f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.315788 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5"} err="failed to get container status \"f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5\": rpc error: code = NotFound desc = could not find container \"f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5\": container with ID starting with f1aa81a2e7c2f48616523dcf110b94778f2b303385ffa4997cb401becd7ca6f5 not 
found: ID does not exist" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.315820 4732 scope.go:117] "RemoveContainer" containerID="605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.316118 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf\": container with ID starting with 605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf not found: ID does not exist" containerID="605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.316142 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf"} err="failed to get container status \"605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf\": rpc error: code = NotFound desc = could not find container \"605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf\": container with ID starting with 605e7eab6f86bf055b798932d603295c67db7cece7348db50b82deb2e719e3bf not found: ID does not exist" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.316156 4732 scope.go:117] "RemoveContainer" containerID="28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.316387 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778\": container with ID starting with 28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778 not found: ID does not exist" containerID="28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.316414 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778"} err="failed to get container status \"28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778\": rpc error: code = NotFound desc = could not find container \"28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778\": container with ID starting with 28a85a34107bd2d6e4fcdf909446ce2f07d771d08bf9ede2bc92de4a43082778 not found: ID does not exist" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.316433 4732 scope.go:117] "RemoveContainer" containerID="540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231" Oct 10 08:32:21 crc kubenswrapper[4732]: E1010 08:32:21.316618 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231\": container with ID starting with 540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231 not found: ID does not exist" containerID="540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.316643 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231"} err="failed to get container status \"540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231\": rpc error: code = NotFound desc = could not find container \"540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231\": container with ID starting with 540e3bc44a027639b4b90b45bf365b8244595b4d67716dfeaa9126a791cca231 not found: ID does not exist" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.337617 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.338306 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.338378 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.338473 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cr74\" (UniqueName: \"kubernetes.io/projected/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-kube-api-access-7cr74\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.338592 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc 
kubenswrapper[4732]: I1010 08:32:21.338676 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.338809 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.342140 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.342797 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.342882 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.343138 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-config\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.444597 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.444730 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.444769 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.444815 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7cr74\" (UniqueName: \"kubernetes.io/projected/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-kube-api-access-7cr74\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.444874 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.444912 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.444937 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.445014 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc 
kubenswrapper[4732]: I1010 08:32:21.445067 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.445090 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.445141 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-config\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.446111 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.450863 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " 
pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.452382 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.452491 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.452583 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.452640 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3cc69d67ea5d204bb1531707b1a0fc5e3407342e2e711ee32a5e22dfb4c3ca84/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.452719 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.455071 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-config\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.457204 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.460380 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.461043 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.463487 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cr74\" (UniqueName: \"kubernetes.io/projected/0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe-kube-api-access-7cr74\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.495415 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9748a009-32df-4212-9452-772ce2d8a551\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9748a009-32df-4212-9452-772ce2d8a551\") pod \"prometheus-metric-storage-0\" (UID: \"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe\") " pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.555036 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.711120 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72b13943-f3b6-48b7-bd72-33eec797e724" path="/var/lib/kubelet/pods/72b13943-f3b6-48b7-bd72-33eec797e724/volumes" Oct 10 08:32:21 crc kubenswrapper[4732]: I1010 08:32:21.712225 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7cff40a-f51f-4eb8-8ae9-f59af55a4e90" path="/var/lib/kubelet/pods/a7cff40a-f51f-4eb8-8ae9-f59af55a4e90/volumes" Oct 10 08:32:22 crc kubenswrapper[4732]: I1010 08:32:22.086412 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 10 08:32:22 crc kubenswrapper[4732]: W1010 08:32:22.091715 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d5ec3e6_9faf_4457_a0ef_8050e9c8cabe.slice/crio-0f5800ea5525c3b2ebbdd9881285f008b59055ed09b93cf2ce38f7e3b5a91593 WatchSource:0}: Error finding container 0f5800ea5525c3b2ebbdd9881285f008b59055ed09b93cf2ce38f7e3b5a91593: Status 404 returned error can't find the container with id 0f5800ea5525c3b2ebbdd9881285f008b59055ed09b93cf2ce38f7e3b5a91593 Oct 10 08:32:22 crc kubenswrapper[4732]: I1010 08:32:22.144794 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe","Type":"ContainerStarted","Data":"0f5800ea5525c3b2ebbdd9881285f008b59055ed09b93cf2ce38f7e3b5a91593"} Oct 10 08:32:25 crc kubenswrapper[4732]: I1010 08:32:25.355980 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:32:25 crc kubenswrapper[4732]: I1010 
08:32:25.356509 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:32:25 crc kubenswrapper[4732]: I1010 08:32:25.986422 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-628n7"] Oct 10 08:32:25 crc kubenswrapper[4732]: I1010 08:32:25.988980 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-628n7" Oct 10 08:32:26 crc kubenswrapper[4732]: I1010 08:32:26.008849 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-628n7"] Oct 10 08:32:26 crc kubenswrapper[4732]: I1010 08:32:26.063029 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpnq\" (UniqueName: \"kubernetes.io/projected/d0d03672-696b-4292-a40d-5b94573eed55-kube-api-access-xhpnq\") pod \"aodh-db-create-628n7\" (UID: \"d0d03672-696b-4292-a40d-5b94573eed55\") " pod="openstack/aodh-db-create-628n7" Oct 10 08:32:26 crc kubenswrapper[4732]: I1010 08:32:26.165819 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpnq\" (UniqueName: \"kubernetes.io/projected/d0d03672-696b-4292-a40d-5b94573eed55-kube-api-access-xhpnq\") pod \"aodh-db-create-628n7\" (UID: \"d0d03672-696b-4292-a40d-5b94573eed55\") " pod="openstack/aodh-db-create-628n7" Oct 10 08:32:26 crc kubenswrapper[4732]: I1010 08:32:26.179877 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe","Type":"ContainerStarted","Data":"dd7c989bbaf1b3db79901e124c8b6216fd50f2b62b62a4d7ada3f5a32d1ebdc2"} Oct 10 08:32:26 crc kubenswrapper[4732]: I1010 
08:32:26.184521 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpnq\" (UniqueName: \"kubernetes.io/projected/d0d03672-696b-4292-a40d-5b94573eed55-kube-api-access-xhpnq\") pod \"aodh-db-create-628n7\" (UID: \"d0d03672-696b-4292-a40d-5b94573eed55\") " pod="openstack/aodh-db-create-628n7" Oct 10 08:32:26 crc kubenswrapper[4732]: I1010 08:32:26.314399 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-628n7" Oct 10 08:32:26 crc kubenswrapper[4732]: I1010 08:32:26.815012 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-628n7"] Oct 10 08:32:26 crc kubenswrapper[4732]: W1010 08:32:26.817906 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0d03672_696b_4292_a40d_5b94573eed55.slice/crio-1dcba3c07db324683ed01eb381c3b8497bc94089f9b132bbe36a65b65000a7ba WatchSource:0}: Error finding container 1dcba3c07db324683ed01eb381c3b8497bc94089f9b132bbe36a65b65000a7ba: Status 404 returned error can't find the container with id 1dcba3c07db324683ed01eb381c3b8497bc94089f9b132bbe36a65b65000a7ba Oct 10 08:32:27 crc kubenswrapper[4732]: I1010 08:32:27.193679 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0d03672-696b-4292-a40d-5b94573eed55" containerID="9def859f4adfd1bc44c46496404ccb26fa55c335b4d23459b98d70dff8cd6c26" exitCode=0 Oct 10 08:32:27 crc kubenswrapper[4732]: I1010 08:32:27.193957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-628n7" event={"ID":"d0d03672-696b-4292-a40d-5b94573eed55","Type":"ContainerDied","Data":"9def859f4adfd1bc44c46496404ccb26fa55c335b4d23459b98d70dff8cd6c26"} Oct 10 08:32:27 crc kubenswrapper[4732]: I1010 08:32:27.194823 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-628n7" 
event={"ID":"d0d03672-696b-4292-a40d-5b94573eed55","Type":"ContainerStarted","Data":"1dcba3c07db324683ed01eb381c3b8497bc94089f9b132bbe36a65b65000a7ba"} Oct 10 08:32:28 crc kubenswrapper[4732]: I1010 08:32:28.588133 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-628n7" Oct 10 08:32:28 crc kubenswrapper[4732]: I1010 08:32:28.718444 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhpnq\" (UniqueName: \"kubernetes.io/projected/d0d03672-696b-4292-a40d-5b94573eed55-kube-api-access-xhpnq\") pod \"d0d03672-696b-4292-a40d-5b94573eed55\" (UID: \"d0d03672-696b-4292-a40d-5b94573eed55\") " Oct 10 08:32:28 crc kubenswrapper[4732]: I1010 08:32:28.726256 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d03672-696b-4292-a40d-5b94573eed55-kube-api-access-xhpnq" (OuterVolumeSpecName: "kube-api-access-xhpnq") pod "d0d03672-696b-4292-a40d-5b94573eed55" (UID: "d0d03672-696b-4292-a40d-5b94573eed55"). InnerVolumeSpecName "kube-api-access-xhpnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:28 crc kubenswrapper[4732]: I1010 08:32:28.821131 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhpnq\" (UniqueName: \"kubernetes.io/projected/d0d03672-696b-4292-a40d-5b94573eed55-kube-api-access-xhpnq\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:29 crc kubenswrapper[4732]: I1010 08:32:29.215684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-628n7" event={"ID":"d0d03672-696b-4292-a40d-5b94573eed55","Type":"ContainerDied","Data":"1dcba3c07db324683ed01eb381c3b8497bc94089f9b132bbe36a65b65000a7ba"} Oct 10 08:32:29 crc kubenswrapper[4732]: I1010 08:32:29.215753 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dcba3c07db324683ed01eb381c3b8497bc94089f9b132bbe36a65b65000a7ba" Oct 10 08:32:29 crc kubenswrapper[4732]: I1010 08:32:29.215779 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-628n7" Oct 10 08:32:33 crc kubenswrapper[4732]: I1010 08:32:33.269663 4732 generic.go:334] "Generic (PLEG): container finished" podID="0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe" containerID="dd7c989bbaf1b3db79901e124c8b6216fd50f2b62b62a4d7ada3f5a32d1ebdc2" exitCode=0 Oct 10 08:32:33 crc kubenswrapper[4732]: I1010 08:32:33.269777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe","Type":"ContainerDied","Data":"dd7c989bbaf1b3db79901e124c8b6216fd50f2b62b62a4d7ada3f5a32d1ebdc2"} Oct 10 08:32:34 crc kubenswrapper[4732]: I1010 08:32:34.291658 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe","Type":"ContainerStarted","Data":"4e15698a74d32d552b6cf719e200b60d8ac790c6b336371c6cc4af337a7f0a78"} Oct 10 08:32:35 crc kubenswrapper[4732]: I1010 08:32:35.312012 4732 
scope.go:117] "RemoveContainer" containerID="e73f07b03688f6e641c4a4b31296aa8f8ea841289cfddb1cf02292012c0d0749" Oct 10 08:32:35 crc kubenswrapper[4732]: I1010 08:32:35.340739 4732 scope.go:117] "RemoveContainer" containerID="f94df7a0d8fc8d013190033d8a5b7e6b4cf69aaeeb8cf4b878c8bb8fd03b84ed" Oct 10 08:32:35 crc kubenswrapper[4732]: I1010 08:32:35.389510 4732 scope.go:117] "RemoveContainer" containerID="c8a825e728af6abc8dd7867631f639827e3c69cab4bb505fcd946900f82ab7ff" Oct 10 08:32:35 crc kubenswrapper[4732]: I1010 08:32:35.431732 4732 scope.go:117] "RemoveContainer" containerID="75b4a4eb6e9af1da58b504db37375eda1c9467b6c6bd15631730c57bb1029754" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.012582 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-709f-account-create-gbtdf"] Oct 10 08:32:36 crc kubenswrapper[4732]: E1010 08:32:36.013096 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d03672-696b-4292-a40d-5b94573eed55" containerName="mariadb-database-create" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.013113 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d03672-696b-4292-a40d-5b94573eed55" containerName="mariadb-database-create" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.013314 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d03672-696b-4292-a40d-5b94573eed55" containerName="mariadb-database-create" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.014071 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-709f-account-create-gbtdf" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.016998 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.022448 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-709f-account-create-gbtdf"] Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.078553 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wsw\" (UniqueName: \"kubernetes.io/projected/69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00-kube-api-access-n5wsw\") pod \"aodh-709f-account-create-gbtdf\" (UID: \"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00\") " pod="openstack/aodh-709f-account-create-gbtdf" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.180663 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wsw\" (UniqueName: \"kubernetes.io/projected/69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00-kube-api-access-n5wsw\") pod \"aodh-709f-account-create-gbtdf\" (UID: \"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00\") " pod="openstack/aodh-709f-account-create-gbtdf" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.200817 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wsw\" (UniqueName: \"kubernetes.io/projected/69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00-kube-api-access-n5wsw\") pod \"aodh-709f-account-create-gbtdf\" (UID: \"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00\") " pod="openstack/aodh-709f-account-create-gbtdf" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.338578 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-709f-account-create-gbtdf" Oct 10 08:32:36 crc kubenswrapper[4732]: I1010 08:32:36.861348 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-709f-account-create-gbtdf"] Oct 10 08:32:36 crc kubenswrapper[4732]: W1010 08:32:36.866849 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69eb5ef8_a699_4c3b_b2be_b7eb9be7ac00.slice/crio-12b598125ba40443739dd3d8e61b391e0ed5f5ceefecc2bb835174f3050752c1 WatchSource:0}: Error finding container 12b598125ba40443739dd3d8e61b391e0ed5f5ceefecc2bb835174f3050752c1: Status 404 returned error can't find the container with id 12b598125ba40443739dd3d8e61b391e0ed5f5ceefecc2bb835174f3050752c1 Oct 10 08:32:37 crc kubenswrapper[4732]: I1010 08:32:37.326994 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe","Type":"ContainerStarted","Data":"4d36a00dc23f1527593d1a7383172abed758c59186d8fc629eea41c6a8d6f28c"} Oct 10 08:32:37 crc kubenswrapper[4732]: I1010 08:32:37.328999 4732 generic.go:334] "Generic (PLEG): container finished" podID="69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00" containerID="b17ff3f177c72c8267d992f014a65debc50db84e79bfc5404cf3457faa207a63" exitCode=0 Oct 10 08:32:37 crc kubenswrapper[4732]: I1010 08:32:37.329044 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-709f-account-create-gbtdf" event={"ID":"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00","Type":"ContainerDied","Data":"b17ff3f177c72c8267d992f014a65debc50db84e79bfc5404cf3457faa207a63"} Oct 10 08:32:37 crc kubenswrapper[4732]: I1010 08:32:37.329093 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-709f-account-create-gbtdf" event={"ID":"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00","Type":"ContainerStarted","Data":"12b598125ba40443739dd3d8e61b391e0ed5f5ceefecc2bb835174f3050752c1"} Oct 10 08:32:38 crc 
kubenswrapper[4732]: I1010 08:32:38.348205 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe","Type":"ContainerStarted","Data":"e055abf5019c3a0ae61be0217a8c6aab21f756e63a4899875ea37b28fb25fa89"} Oct 10 08:32:38 crc kubenswrapper[4732]: I1010 08:32:38.379915 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.37990051 podStartE2EDuration="17.37990051s" podCreationTimestamp="2025-10-10 08:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:32:38.378907123 +0000 UTC m=+6085.448498384" watchObservedRunningTime="2025-10-10 08:32:38.37990051 +0000 UTC m=+6085.449491751" Oct 10 08:32:38 crc kubenswrapper[4732]: I1010 08:32:38.836435 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-709f-account-create-gbtdf" Oct 10 08:32:38 crc kubenswrapper[4732]: I1010 08:32:38.937818 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5wsw\" (UniqueName: \"kubernetes.io/projected/69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00-kube-api-access-n5wsw\") pod \"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00\" (UID: \"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00\") " Oct 10 08:32:38 crc kubenswrapper[4732]: I1010 08:32:38.943287 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00-kube-api-access-n5wsw" (OuterVolumeSpecName: "kube-api-access-n5wsw") pod "69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00" (UID: "69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00"). InnerVolumeSpecName "kube-api-access-n5wsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:39 crc kubenswrapper[4732]: I1010 08:32:39.040284 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5wsw\" (UniqueName: \"kubernetes.io/projected/69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00-kube-api-access-n5wsw\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:39 crc kubenswrapper[4732]: I1010 08:32:39.356742 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-709f-account-create-gbtdf" event={"ID":"69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00","Type":"ContainerDied","Data":"12b598125ba40443739dd3d8e61b391e0ed5f5ceefecc2bb835174f3050752c1"} Oct 10 08:32:39 crc kubenswrapper[4732]: I1010 08:32:39.356794 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b598125ba40443739dd3d8e61b391e0ed5f5ceefecc2bb835174f3050752c1" Oct 10 08:32:39 crc kubenswrapper[4732]: I1010 08:32:39.356765 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-709f-account-create-gbtdf" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.019623 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.433845 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-nbjc4"] Oct 10 08:32:41 crc kubenswrapper[4732]: E1010 08:32:41.434761 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00" containerName="mariadb-account-create" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.434780 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00" containerName="mariadb-account-create" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.435041 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00" containerName="mariadb-account-create" 
Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.435984 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.438127 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-jtl65" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.438365 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.443033 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-nbjc4"] Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.443042 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.486838 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-scripts\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.486959 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj7pw\" (UniqueName: \"kubernetes.io/projected/5df0b5c8-3752-4897-aef3-903211609e38-kube-api-access-kj7pw\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.487128 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-config-data\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc 
kubenswrapper[4732]: I1010 08:32:41.487192 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-combined-ca-bundle\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.555156 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.588960 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj7pw\" (UniqueName: \"kubernetes.io/projected/5df0b5c8-3752-4897-aef3-903211609e38-kube-api-access-kj7pw\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.589114 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-config-data\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.589154 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-combined-ca-bundle\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.589223 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-scripts\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " 
pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.595749 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-scripts\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.597315 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-combined-ca-bundle\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.599919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-config-data\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.605005 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj7pw\" (UniqueName: \"kubernetes.io/projected/5df0b5c8-3752-4897-aef3-903211609e38-kube-api-access-kj7pw\") pod \"aodh-db-sync-nbjc4\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:41 crc kubenswrapper[4732]: I1010 08:32:41.759928 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:42 crc kubenswrapper[4732]: I1010 08:32:42.277342 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-nbjc4"] Oct 10 08:32:42 crc kubenswrapper[4732]: I1010 08:32:42.406878 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nbjc4" event={"ID":"5df0b5c8-3752-4897-aef3-903211609e38","Type":"ContainerStarted","Data":"ca9bd78f8e19d9415c3da9e2ed1b77c488c8e6662d1e3e9756074c2c6b8155f8"} Oct 10 08:32:46 crc kubenswrapper[4732]: I1010 08:32:46.491375 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:32:46 crc kubenswrapper[4732]: I1010 08:32:46.496657 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="aa613f10-ba46-4e24-8785-055e40ac96cb" containerName="kube-state-metrics" containerID="cri-o://40b3998a24e004f183069cac1cb70a9af429501def31e5e9c9774bf841d27889" gracePeriod=30 Oct 10 08:32:47 crc kubenswrapper[4732]: I1010 08:32:47.467036 4732 generic.go:334] "Generic (PLEG): container finished" podID="aa613f10-ba46-4e24-8785-055e40ac96cb" containerID="40b3998a24e004f183069cac1cb70a9af429501def31e5e9c9774bf841d27889" exitCode=2 Oct 10 08:32:47 crc kubenswrapper[4732]: I1010 08:32:47.467190 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"aa613f10-ba46-4e24-8785-055e40ac96cb","Type":"ContainerDied","Data":"40b3998a24e004f183069cac1cb70a9af429501def31e5e9c9774bf841d27889"} Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.399556 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.461483 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qndl9\" (UniqueName: \"kubernetes.io/projected/aa613f10-ba46-4e24-8785-055e40ac96cb-kube-api-access-qndl9\") pod \"aa613f10-ba46-4e24-8785-055e40ac96cb\" (UID: \"aa613f10-ba46-4e24-8785-055e40ac96cb\") " Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.489300 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa613f10-ba46-4e24-8785-055e40ac96cb-kube-api-access-qndl9" (OuterVolumeSpecName: "kube-api-access-qndl9") pod "aa613f10-ba46-4e24-8785-055e40ac96cb" (UID: "aa613f10-ba46-4e24-8785-055e40ac96cb"). InnerVolumeSpecName "kube-api-access-qndl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.514744 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"aa613f10-ba46-4e24-8785-055e40ac96cb","Type":"ContainerDied","Data":"6edf6228ac4eb8d15ea7e2f58e0a2a102657d62cf96b3865ef23c0b1ccae3bd8"} Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.515129 4732 scope.go:117] "RemoveContainer" containerID="40b3998a24e004f183069cac1cb70a9af429501def31e5e9c9774bf841d27889" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.514910 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.563992 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qndl9\" (UniqueName: \"kubernetes.io/projected/aa613f10-ba46-4e24-8785-055e40ac96cb-kube-api-access-qndl9\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.567837 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.578453 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.590777 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:32:48 crc kubenswrapper[4732]: E1010 08:32:48.591274 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa613f10-ba46-4e24-8785-055e40ac96cb" containerName="kube-state-metrics" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.591291 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa613f10-ba46-4e24-8785-055e40ac96cb" containerName="kube-state-metrics" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.591477 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa613f10-ba46-4e24-8785-055e40ac96cb" containerName="kube-state-metrics" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.592239 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.596399 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.597017 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.601875 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.665815 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.665894 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257r4\" (UniqueName: \"kubernetes.io/projected/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-api-access-257r4\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.666114 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.666194 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.675173 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.675456 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-central-agent" containerID="cri-o://8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e" gracePeriod=30 Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.675567 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="proxy-httpd" containerID="cri-o://2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02" gracePeriod=30 Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.675625 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="sg-core" containerID="cri-o://c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643" gracePeriod=30 Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.675638 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-notification-agent" containerID="cri-o://c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6" gracePeriod=30 Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.768528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.768648 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257r4\" (UniqueName: \"kubernetes.io/projected/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-api-access-257r4\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.768740 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.768763 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.774287 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.791247 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.791602 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.863613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257r4\" (UniqueName: \"kubernetes.io/projected/4c4e511b-2d1b-4771-a029-65d752c5728b-kube-api-access-257r4\") pod \"kube-state-metrics-0\" (UID: \"4c4e511b-2d1b-4771-a029-65d752c5728b\") " pod="openstack/kube-state-metrics-0" Oct 10 08:32:48 crc kubenswrapper[4732]: I1010 08:32:48.927328 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.505036 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.529054 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4c4e511b-2d1b-4771-a029-65d752c5728b","Type":"ContainerStarted","Data":"d72636a8143f9bc1941f3e46458eef375e1bd4379f940db3da2f192c2d84b82a"} Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.533572 4732 generic.go:334] "Generic (PLEG): container finished" podID="440be690-fb57-4a22-be01-05a8113a84a6" containerID="2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02" exitCode=0 Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.533652 4732 generic.go:334] "Generic (PLEG): container finished" podID="440be690-fb57-4a22-be01-05a8113a84a6" containerID="c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643" exitCode=2 Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.533662 4732 generic.go:334] "Generic (PLEG): container finished" podID="440be690-fb57-4a22-be01-05a8113a84a6" containerID="8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e" exitCode=0 Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.533795 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerDied","Data":"2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02"} Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.533826 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerDied","Data":"c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643"} Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.533839 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerDied","Data":"8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e"} Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.537271 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nbjc4" event={"ID":"5df0b5c8-3752-4897-aef3-903211609e38","Type":"ContainerStarted","Data":"cded0d67c97d744d406da3bfcf7e4c5699910effa01d42815242887c7f043bc7"} Oct 10 08:32:49 crc kubenswrapper[4732]: I1010 08:32:49.673228 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa613f10-ba46-4e24-8785-055e40ac96cb" path="/var/lib/kubelet/pods/aa613f10-ba46-4e24-8785-055e40ac96cb/volumes" Oct 10 08:32:50 crc kubenswrapper[4732]: I1010 08:32:50.546623 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4c4e511b-2d1b-4771-a029-65d752c5728b","Type":"ContainerStarted","Data":"8e25a01fbe7bb77e6660779941ba8bf3607ece23e05f2088e49d731cc1b65720"} Oct 10 08:32:50 crc kubenswrapper[4732]: I1010 08:32:50.565808 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.135031727 podStartE2EDuration="2.56578392s" podCreationTimestamp="2025-10-10 08:32:48 +0000 UTC" firstStartedPulling="2025-10-10 08:32:49.513891094 +0000 UTC m=+6096.583482335" lastFinishedPulling="2025-10-10 08:32:49.944643287 +0000 UTC m=+6097.014234528" observedRunningTime="2025-10-10 08:32:50.562607225 +0000 UTC m=+6097.632198486" watchObservedRunningTime="2025-10-10 08:32:50.56578392 +0000 UTC m=+6097.635375171" Oct 10 08:32:50 crc kubenswrapper[4732]: I1010 08:32:50.568013 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-nbjc4" podStartSLOduration=3.6479969949999997 podStartE2EDuration="9.56799951s" podCreationTimestamp="2025-10-10 08:32:41 +0000 UTC" firstStartedPulling="2025-10-10 
08:32:42.25999721 +0000 UTC m=+6089.329588451" lastFinishedPulling="2025-10-10 08:32:48.179999725 +0000 UTC m=+6095.249590966" observedRunningTime="2025-10-10 08:32:49.555641252 +0000 UTC m=+6096.625232493" watchObservedRunningTime="2025-10-10 08:32:50.56799951 +0000 UTC m=+6097.637590761" Oct 10 08:32:51 crc kubenswrapper[4732]: I1010 08:32:51.555253 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:51 crc kubenswrapper[4732]: I1010 08:32:51.559649 4732 generic.go:334] "Generic (PLEG): container finished" podID="5df0b5c8-3752-4897-aef3-903211609e38" containerID="cded0d67c97d744d406da3bfcf7e4c5699910effa01d42815242887c7f043bc7" exitCode=0 Oct 10 08:32:51 crc kubenswrapper[4732]: I1010 08:32:51.559766 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nbjc4" event={"ID":"5df0b5c8-3752-4897-aef3-903211609e38","Type":"ContainerDied","Data":"cded0d67c97d744d406da3bfcf7e4c5699910effa01d42815242887c7f043bc7"} Oct 10 08:32:51 crc kubenswrapper[4732]: I1010 08:32:51.559819 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 10 08:32:51 crc kubenswrapper[4732]: I1010 08:32:51.564327 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:52 crc kubenswrapper[4732]: I1010 08:32:52.578759 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.138174 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.187173 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-config-data\") pod \"5df0b5c8-3752-4897-aef3-903211609e38\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.187263 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-combined-ca-bundle\") pod \"5df0b5c8-3752-4897-aef3-903211609e38\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.187355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj7pw\" (UniqueName: \"kubernetes.io/projected/5df0b5c8-3752-4897-aef3-903211609e38-kube-api-access-kj7pw\") pod \"5df0b5c8-3752-4897-aef3-903211609e38\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.187399 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-scripts\") pod \"5df0b5c8-3752-4897-aef3-903211609e38\" (UID: \"5df0b5c8-3752-4897-aef3-903211609e38\") " Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.206935 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-scripts" (OuterVolumeSpecName: "scripts") pod "5df0b5c8-3752-4897-aef3-903211609e38" (UID: "5df0b5c8-3752-4897-aef3-903211609e38"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.213654 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df0b5c8-3752-4897-aef3-903211609e38-kube-api-access-kj7pw" (OuterVolumeSpecName: "kube-api-access-kj7pw") pod "5df0b5c8-3752-4897-aef3-903211609e38" (UID: "5df0b5c8-3752-4897-aef3-903211609e38"). InnerVolumeSpecName "kube-api-access-kj7pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.224262 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5df0b5c8-3752-4897-aef3-903211609e38" (UID: "5df0b5c8-3752-4897-aef3-903211609e38"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.242933 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-config-data" (OuterVolumeSpecName: "config-data") pod "5df0b5c8-3752-4897-aef3-903211609e38" (UID: "5df0b5c8-3752-4897-aef3-903211609e38"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.289328 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj7pw\" (UniqueName: \"kubernetes.io/projected/5df0b5c8-3752-4897-aef3-903211609e38-kube-api-access-kj7pw\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.289363 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.289374 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.289383 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5df0b5c8-3752-4897-aef3-903211609e38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.581771 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-nbjc4" Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.582762 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-nbjc4" event={"ID":"5df0b5c8-3752-4897-aef3-903211609e38","Type":"ContainerDied","Data":"ca9bd78f8e19d9415c3da9e2ed1b77c488c8e6662d1e3e9756074c2c6b8155f8"} Oct 10 08:32:53 crc kubenswrapper[4732]: I1010 08:32:53.582839 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9bd78f8e19d9415c3da9e2ed1b77c488c8e6662d1e3e9756074c2c6b8155f8" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.301165 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.411457 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-config-data\") pod \"440be690-fb57-4a22-be01-05a8113a84a6\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.411625 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-combined-ca-bundle\") pod \"440be690-fb57-4a22-be01-05a8113a84a6\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.411675 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-sg-core-conf-yaml\") pod \"440be690-fb57-4a22-be01-05a8113a84a6\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.411743 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-run-httpd\") pod \"440be690-fb57-4a22-be01-05a8113a84a6\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.411764 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-scripts\") pod \"440be690-fb57-4a22-be01-05a8113a84a6\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.411807 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-log-httpd\") pod \"440be690-fb57-4a22-be01-05a8113a84a6\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.411898 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qqx6\" (UniqueName: \"kubernetes.io/projected/440be690-fb57-4a22-be01-05a8113a84a6-kube-api-access-6qqx6\") pod \"440be690-fb57-4a22-be01-05a8113a84a6\" (UID: \"440be690-fb57-4a22-be01-05a8113a84a6\") " Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.413076 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "440be690-fb57-4a22-be01-05a8113a84a6" (UID: "440be690-fb57-4a22-be01-05a8113a84a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.415337 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "440be690-fb57-4a22-be01-05a8113a84a6" (UID: "440be690-fb57-4a22-be01-05a8113a84a6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.420534 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-scripts" (OuterVolumeSpecName: "scripts") pod "440be690-fb57-4a22-be01-05a8113a84a6" (UID: "440be690-fb57-4a22-be01-05a8113a84a6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.420609 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440be690-fb57-4a22-be01-05a8113a84a6-kube-api-access-6qqx6" (OuterVolumeSpecName: "kube-api-access-6qqx6") pod "440be690-fb57-4a22-be01-05a8113a84a6" (UID: "440be690-fb57-4a22-be01-05a8113a84a6"). InnerVolumeSpecName "kube-api-access-6qqx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.443525 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "440be690-fb57-4a22-be01-05a8113a84a6" (UID: "440be690-fb57-4a22-be01-05a8113a84a6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.502809 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "440be690-fb57-4a22-be01-05a8113a84a6" (UID: "440be690-fb57-4a22-be01-05a8113a84a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.513983 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.514022 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.514035 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.514049 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.514061 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/440be690-fb57-4a22-be01-05a8113a84a6-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.514072 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qqx6\" (UniqueName: \"kubernetes.io/projected/440be690-fb57-4a22-be01-05a8113a84a6-kube-api-access-6qqx6\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.515013 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-config-data" (OuterVolumeSpecName: "config-data") pod "440be690-fb57-4a22-be01-05a8113a84a6" (UID: "440be690-fb57-4a22-be01-05a8113a84a6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.594068 4732 generic.go:334] "Generic (PLEG): container finished" podID="440be690-fb57-4a22-be01-05a8113a84a6" containerID="c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6" exitCode=0 Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.594126 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerDied","Data":"c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6"} Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.594157 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"440be690-fb57-4a22-be01-05a8113a84a6","Type":"ContainerDied","Data":"2803bff84de7355bc95408b6682b7ba9ab0329bbd10b7cc90d234034de943ce6"} Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.594159 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.594176 4732 scope.go:117] "RemoveContainer" containerID="2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.615533 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/440be690-fb57-4a22-be01-05a8113a84a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.620130 4732 scope.go:117] "RemoveContainer" containerID="c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.636098 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.645222 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.651586 4732 scope.go:117] "RemoveContainer" containerID="c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669037 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.669472 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-central-agent" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669492 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-central-agent" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.669503 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="proxy-httpd" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669509 4732 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="proxy-httpd" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.669520 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-notification-agent" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669527 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-notification-agent" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.669553 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="sg-core" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669558 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="sg-core" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.669578 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df0b5c8-3752-4897-aef3-903211609e38" containerName="aodh-db-sync" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669585 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df0b5c8-3752-4897-aef3-903211609e38" containerName="aodh-db-sync" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669859 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-notification-agent" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669889 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="sg-core" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669908 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="proxy-httpd" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669927 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5df0b5c8-3752-4897-aef3-903211609e38" containerName="aodh-db-sync" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.669942 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="440be690-fb57-4a22-be01-05a8113a84a6" containerName="ceilometer-central-agent" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.677757 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.677805 4732 scope.go:117] "RemoveContainer" containerID="8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.677867 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.680854 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.681129 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.681267 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.717141 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmxds\" (UniqueName: \"kubernetes.io/projected/9477cf73-4bb4-4434-944e-c41af27ada51-kube-api-access-nmxds\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.717302 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-log-httpd\") 
pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.717518 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-run-httpd\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.717629 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-scripts\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.717666 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.717782 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-config-data\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.717958 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 
08:32:54.718039 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.718499 4732 scope.go:117] "RemoveContainer" containerID="2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.718951 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02\": container with ID starting with 2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02 not found: ID does not exist" containerID="2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.718985 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02"} err="failed to get container status \"2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02\": rpc error: code = NotFound desc = could not find container \"2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02\": container with ID starting with 2f9697134cc8224c29f7d11deb47f4e9e651d0c41bb72253a00bd10e93747b02 not found: ID does not exist" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.719011 4732 scope.go:117] "RemoveContainer" containerID="c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.719416 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643\": container with ID starting with c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643 not found: ID does not exist" containerID="c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.719451 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643"} err="failed to get container status \"c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643\": rpc error: code = NotFound desc = could not find container \"c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643\": container with ID starting with c5e4267592f466ab5bf3f22e283bb3f429e891fefc3840f290fec474c93b8643 not found: ID does not exist" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.719471 4732 scope.go:117] "RemoveContainer" containerID="c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.723268 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6\": container with ID starting with c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6 not found: ID does not exist" containerID="c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.723316 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6"} err="failed to get container status \"c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6\": rpc error: code = NotFound desc = could not find container \"c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6\": container with ID 
starting with c7d1ced4d979f81ea31dbcf05d651878f3e60d3a51770c94469bdff27daa71a6 not found: ID does not exist" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.723346 4732 scope.go:117] "RemoveContainer" containerID="8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e" Oct 10 08:32:54 crc kubenswrapper[4732]: E1010 08:32:54.723788 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e\": container with ID starting with 8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e not found: ID does not exist" containerID="8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.723989 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e"} err="failed to get container status \"8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e\": rpc error: code = NotFound desc = could not find container \"8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e\": container with ID starting with 8ba2e085c9a9214ae29c33cb4805e9c048e0db4f6b7ca8ce1ebd54922187a51e not found: ID does not exist" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823253 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823318 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823369 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmxds\" (UniqueName: \"kubernetes.io/projected/9477cf73-4bb4-4434-944e-c41af27ada51-kube-api-access-nmxds\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823425 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-log-httpd\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823480 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-run-httpd\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823529 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-scripts\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823561 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.823599 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-config-data\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.824875 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-log-httpd\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.824887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-run-httpd\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.827680 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.827875 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.828264 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-scripts\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.828361 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.829943 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-config-data\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.840470 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmxds\" (UniqueName: \"kubernetes.io/projected/9477cf73-4bb4-4434-944e-c41af27ada51-kube-api-access-nmxds\") pod \"ceilometer-0\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " pod="openstack/ceilometer-0" Oct 10 08:32:54 crc kubenswrapper[4732]: I1010 08:32:54.999668 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:32:55 crc kubenswrapper[4732]: I1010 08:32:55.355811 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:32:55 crc kubenswrapper[4732]: I1010 08:32:55.356095 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:32:55 crc kubenswrapper[4732]: I1010 08:32:55.472084 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:55 crc kubenswrapper[4732]: I1010 08:32:55.613416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerStarted","Data":"6b4e7570a7937ac989691e11c16f962c7219e9559f7fa28b70f979847ae735dd"} Oct 10 08:32:55 crc kubenswrapper[4732]: I1010 08:32:55.672678 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440be690-fb57-4a22-be01-05a8113a84a6" path="/var/lib/kubelet/pods/440be690-fb57-4a22-be01-05a8113a84a6/volumes" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.604292 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.618148 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.624965 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.625053 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-jtl65" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.625330 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.635109 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerStarted","Data":"2428577787c7dedb8e91cb2a77a63f6bb56e54f5081da0429bb0ec3a97052282"} Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.635148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerStarted","Data":"762465b092d705cf8373d57e9fa6cab7014b5df0a3ca38fcf999d3f1482eabb6"} Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.637649 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.665902 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.666024 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7hb\" (UniqueName: \"kubernetes.io/projected/6c227997-a6d0-403f-a0d3-364f813c6f32-kube-api-access-6f7hb\") pod \"aodh-0\" (UID: 
\"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.666071 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-scripts\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.666103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-config-data\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.768541 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-scripts\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.768590 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-config-data\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.768733 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.768822 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7hb\" (UniqueName: 
\"kubernetes.io/projected/6c227997-a6d0-403f-a0d3-364f813c6f32-kube-api-access-6f7hb\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.772433 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-scripts\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.774013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.779402 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-config-data\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.786967 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7hb\" (UniqueName: \"kubernetes.io/projected/6c227997-a6d0-403f-a0d3-364f813c6f32-kube-api-access-6f7hb\") pod \"aodh-0\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " pod="openstack/aodh-0" Oct 10 08:32:56 crc kubenswrapper[4732]: I1010 08:32:56.946432 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 10 08:32:57 crc kubenswrapper[4732]: I1010 08:32:57.055349 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-j7p9g"] Oct 10 08:32:57 crc kubenswrapper[4732]: I1010 08:32:57.069105 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-j7p9g"] Oct 10 08:32:57 crc kubenswrapper[4732]: I1010 08:32:57.431782 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 10 08:32:57 crc kubenswrapper[4732]: W1010 08:32:57.437070 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c227997_a6d0_403f_a0d3_364f813c6f32.slice/crio-02a9711563b85e49d41892896d5d4841bfd70937e0ef8b06cc84f186c6018785 WatchSource:0}: Error finding container 02a9711563b85e49d41892896d5d4841bfd70937e0ef8b06cc84f186c6018785: Status 404 returned error can't find the container with id 02a9711563b85e49d41892896d5d4841bfd70937e0ef8b06cc84f186c6018785 Oct 10 08:32:57 crc kubenswrapper[4732]: I1010 08:32:57.644408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerStarted","Data":"02a9711563b85e49d41892896d5d4841bfd70937e0ef8b06cc84f186c6018785"} Oct 10 08:32:57 crc kubenswrapper[4732]: I1010 08:32:57.646964 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerStarted","Data":"ce3f08434f8bf189c205b27ed1a0ae2909eb6941cf5ef4c55673b1295d39bab7"} Oct 10 08:32:57 crc kubenswrapper[4732]: I1010 08:32:57.670489 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9208efd3-71f4-4b1d-9973-c27759728a30" path="/var/lib/kubelet/pods/9208efd3-71f4-4b1d-9973-c27759728a30/volumes" Oct 10 08:32:58 crc kubenswrapper[4732]: I1010 08:32:58.662236 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerStarted","Data":"84366756b0f02881702800459e645924bb46f63379426ae4cbc11670ae763dce"} Oct 10 08:32:58 crc kubenswrapper[4732]: I1010 08:32:58.755829 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:32:58 crc kubenswrapper[4732]: I1010 08:32:58.974998 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.426936 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.723451 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerStarted","Data":"ed09c45323fb656075ce0a8645d8aca7e44c5f165f2fac31a6f9d677973d75ab"} Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.765834 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerStarted","Data":"3f16df21745701be24aa1d395f300ff4711c2ee353e63f6a02c0df2b0e9e5439"} Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.766034 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="ceilometer-central-agent" containerID="cri-o://762465b092d705cf8373d57e9fa6cab7014b5df0a3ca38fcf999d3f1482eabb6" gracePeriod=30 Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.766345 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.766749 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="proxy-httpd" 
containerID="cri-o://3f16df21745701be24aa1d395f300ff4711c2ee353e63f6a02c0df2b0e9e5439" gracePeriod=30 Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.766820 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="sg-core" containerID="cri-o://ce3f08434f8bf189c205b27ed1a0ae2909eb6941cf5ef4c55673b1295d39bab7" gracePeriod=30 Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.766863 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="ceilometer-notification-agent" containerID="cri-o://2428577787c7dedb8e91cb2a77a63f6bb56e54f5081da0429bb0ec3a97052282" gracePeriod=30 Oct 10 08:33:00 crc kubenswrapper[4732]: I1010 08:33:00.794532 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.305172598 podStartE2EDuration="6.794501613s" podCreationTimestamp="2025-10-10 08:32:54 +0000 UTC" firstStartedPulling="2025-10-10 08:32:55.484798161 +0000 UTC m=+6102.554389412" lastFinishedPulling="2025-10-10 08:32:59.974127186 +0000 UTC m=+6107.043718427" observedRunningTime="2025-10-10 08:33:00.791666367 +0000 UTC m=+6107.861257618" watchObservedRunningTime="2025-10-10 08:33:00.794501613 +0000 UTC m=+6107.864092854" Oct 10 08:33:01 crc kubenswrapper[4732]: I1010 08:33:01.789109 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerStarted","Data":"1dd1278df606935a5bc35c51c7fcea8d13210485210ebebbb63e0691262140ad"} Oct 10 08:33:01 crc kubenswrapper[4732]: I1010 08:33:01.796806 4732 generic.go:334] "Generic (PLEG): container finished" podID="9477cf73-4bb4-4434-944e-c41af27ada51" containerID="3f16df21745701be24aa1d395f300ff4711c2ee353e63f6a02c0df2b0e9e5439" exitCode=0 Oct 10 08:33:01 crc 
kubenswrapper[4732]: I1010 08:33:01.796862 4732 generic.go:334] "Generic (PLEG): container finished" podID="9477cf73-4bb4-4434-944e-c41af27ada51" containerID="ce3f08434f8bf189c205b27ed1a0ae2909eb6941cf5ef4c55673b1295d39bab7" exitCode=2 Oct 10 08:33:01 crc kubenswrapper[4732]: I1010 08:33:01.796872 4732 generic.go:334] "Generic (PLEG): container finished" podID="9477cf73-4bb4-4434-944e-c41af27ada51" containerID="2428577787c7dedb8e91cb2a77a63f6bb56e54f5081da0429bb0ec3a97052282" exitCode=0 Oct 10 08:33:01 crc kubenswrapper[4732]: I1010 08:33:01.796896 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerDied","Data":"3f16df21745701be24aa1d395f300ff4711c2ee353e63f6a02c0df2b0e9e5439"} Oct 10 08:33:01 crc kubenswrapper[4732]: I1010 08:33:01.796930 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerDied","Data":"ce3f08434f8bf189c205b27ed1a0ae2909eb6941cf5ef4c55673b1295d39bab7"} Oct 10 08:33:01 crc kubenswrapper[4732]: I1010 08:33:01.796943 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerDied","Data":"2428577787c7dedb8e91cb2a77a63f6bb56e54f5081da0429bb0ec3a97052282"} Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.825434 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerStarted","Data":"366cbe97961ca2a94aa3e23dee9a33541f7cb465f617e861f8ebd0c56fd65c8d"} Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.825500 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-api" containerID="cri-o://84366756b0f02881702800459e645924bb46f63379426ae4cbc11670ae763dce" 
gracePeriod=30 Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.828232 4732 generic.go:334] "Generic (PLEG): container finished" podID="9477cf73-4bb4-4434-944e-c41af27ada51" containerID="762465b092d705cf8373d57e9fa6cab7014b5df0a3ca38fcf999d3f1482eabb6" exitCode=0 Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.826020 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-notifier" containerID="cri-o://1dd1278df606935a5bc35c51c7fcea8d13210485210ebebbb63e0691262140ad" gracePeriod=30 Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.826033 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-evaluator" containerID="cri-o://ed09c45323fb656075ce0a8645d8aca7e44c5f165f2fac31a6f9d677973d75ab" gracePeriod=30 Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.828337 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerDied","Data":"762465b092d705cf8373d57e9fa6cab7014b5df0a3ca38fcf999d3f1482eabb6"} Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.826002 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-listener" containerID="cri-o://366cbe97961ca2a94aa3e23dee9a33541f7cb465f617e861f8ebd0c56fd65c8d" gracePeriod=30 Oct 10 08:33:03 crc kubenswrapper[4732]: I1010 08:33:03.854281 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.322267142 podStartE2EDuration="7.854258157s" podCreationTimestamp="2025-10-10 08:32:56 +0000 UTC" firstStartedPulling="2025-10-10 08:32:57.440625253 +0000 UTC m=+6104.510216494" lastFinishedPulling="2025-10-10 08:33:02.972616268 
+0000 UTC m=+6110.042207509" observedRunningTime="2025-10-10 08:33:03.846426757 +0000 UTC m=+6110.916018008" watchObservedRunningTime="2025-10-10 08:33:03.854258157 +0000 UTC m=+6110.923849398" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.319250 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437347 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-sg-core-conf-yaml\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437404 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-scripts\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437440 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-combined-ca-bundle\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437504 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-ceilometer-tls-certs\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437590 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-run-httpd\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437619 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-log-httpd\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437646 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-config-data\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.437767 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmxds\" (UniqueName: \"kubernetes.io/projected/9477cf73-4bb4-4434-944e-c41af27ada51-kube-api-access-nmxds\") pod \"9477cf73-4bb4-4434-944e-c41af27ada51\" (UID: \"9477cf73-4bb4-4434-944e-c41af27ada51\") " Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.439901 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.440201 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.443524 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-scripts" (OuterVolumeSpecName: "scripts") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.444616 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9477cf73-4bb4-4434-944e-c41af27ada51-kube-api-access-nmxds" (OuterVolumeSpecName: "kube-api-access-nmxds") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). InnerVolumeSpecName "kube-api-access-nmxds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.476975 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.486974 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.527292 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.541282 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.541321 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.541333 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9477cf73-4bb4-4434-944e-c41af27ada51-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.541344 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmxds\" (UniqueName: \"kubernetes.io/projected/9477cf73-4bb4-4434-944e-c41af27ada51-kube-api-access-nmxds\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.541354 4732 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.541362 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.541371 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.556285 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-config-data" (OuterVolumeSpecName: "config-data") pod "9477cf73-4bb4-4434-944e-c41af27ada51" (UID: "9477cf73-4bb4-4434-944e-c41af27ada51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.643099 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9477cf73-4bb4-4434-944e-c41af27ada51-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.844534 4732 generic.go:334] "Generic (PLEG): container finished" podID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerID="1dd1278df606935a5bc35c51c7fcea8d13210485210ebebbb63e0691262140ad" exitCode=0 Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.844566 4732 generic.go:334] "Generic (PLEG): container finished" podID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerID="ed09c45323fb656075ce0a8645d8aca7e44c5f165f2fac31a6f9d677973d75ab" exitCode=0 Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.844574 4732 generic.go:334] "Generic (PLEG): container finished" podID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerID="84366756b0f02881702800459e645924bb46f63379426ae4cbc11670ae763dce" exitCode=0 Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.845383 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerDied","Data":"1dd1278df606935a5bc35c51c7fcea8d13210485210ebebbb63e0691262140ad"} Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.845517 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerDied","Data":"ed09c45323fb656075ce0a8645d8aca7e44c5f165f2fac31a6f9d677973d75ab"} Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.845532 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerDied","Data":"84366756b0f02881702800459e645924bb46f63379426ae4cbc11670ae763dce"} Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.850202 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9477cf73-4bb4-4434-944e-c41af27ada51","Type":"ContainerDied","Data":"6b4e7570a7937ac989691e11c16f962c7219e9559f7fa28b70f979847ae735dd"} Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.850309 4732 scope.go:117] "RemoveContainer" containerID="3f16df21745701be24aa1d395f300ff4711c2ee353e63f6a02c0df2b0e9e5439" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.850470 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.885686 4732 scope.go:117] "RemoveContainer" containerID="ce3f08434f8bf189c205b27ed1a0ae2909eb6941cf5ef4c55673b1295d39bab7" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.908540 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.919924 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.920099 4732 scope.go:117] "RemoveContainer" containerID="2428577787c7dedb8e91cb2a77a63f6bb56e54f5081da0429bb0ec3a97052282" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.947344 4732 scope.go:117] "RemoveContainer" containerID="762465b092d705cf8373d57e9fa6cab7014b5df0a3ca38fcf999d3f1482eabb6" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948077 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:33:04 crc kubenswrapper[4732]: E1010 08:33:04.948463 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="sg-core" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948480 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="sg-core" Oct 10 08:33:04 crc kubenswrapper[4732]: E1010 08:33:04.948491 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="proxy-httpd" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948498 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="proxy-httpd" Oct 10 08:33:04 crc kubenswrapper[4732]: E1010 08:33:04.948516 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" 
containerName="ceilometer-central-agent" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948522 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="ceilometer-central-agent" Oct 10 08:33:04 crc kubenswrapper[4732]: E1010 08:33:04.948548 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="ceilometer-notification-agent" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948555 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="ceilometer-notification-agent" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948757 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="sg-core" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948768 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="ceilometer-notification-agent" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948780 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="ceilometer-central-agent" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.948793 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" containerName="proxy-httpd" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.950661 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.952799 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-config-data\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.952831 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.952862 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-scripts\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.952891 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.952942 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.953065 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76531d1c-d162-4a95-9764-be79581cd832-run-httpd\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.953166 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76531d1c-d162-4a95-9764-be79581cd832-log-httpd\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.953299 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk7g6\" (UniqueName: \"kubernetes.io/projected/76531d1c-d162-4a95-9764-be79581cd832-kube-api-access-fk7g6\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.955291 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.955426 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.955616 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 10 08:33:04 crc kubenswrapper[4732]: I1010 08:33:04.973126 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.054845 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.054941 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.055039 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76531d1c-d162-4a95-9764-be79581cd832-run-httpd\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.055064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76531d1c-d162-4a95-9764-be79581cd832-log-httpd\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.055094 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk7g6\" (UniqueName: \"kubernetes.io/projected/76531d1c-d162-4a95-9764-be79581cd832-kube-api-access-fk7g6\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.055146 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-config-data\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.055178 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.055415 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-scripts\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.056065 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76531d1c-d162-4a95-9764-be79581cd832-run-httpd\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.056406 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76531d1c-d162-4a95-9764-be79581cd832-log-httpd\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.060055 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.060373 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 
08:33:05.061859 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.062179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-config-data\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.064216 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76531d1c-d162-4a95-9764-be79581cd832-scripts\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.075865 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk7g6\" (UniqueName: \"kubernetes.io/projected/76531d1c-d162-4a95-9764-be79581cd832-kube-api-access-fk7g6\") pod \"ceilometer-0\" (UID: \"76531d1c-d162-4a95-9764-be79581cd832\") " pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.273606 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.671343 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9477cf73-4bb4-4434-944e-c41af27ada51" path="/var/lib/kubelet/pods/9477cf73-4bb4-4434-944e-c41af27ada51/volumes" Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.716488 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 10 08:33:05 crc kubenswrapper[4732]: W1010 08:33:05.723532 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76531d1c_d162_4a95_9764_be79581cd832.slice/crio-b3e0afe24718fc4953a59842e1d8d27fb7dfb770f4f2eb7fdf65324ab6d70ba2 WatchSource:0}: Error finding container b3e0afe24718fc4953a59842e1d8d27fb7dfb770f4f2eb7fdf65324ab6d70ba2: Status 404 returned error can't find the container with id b3e0afe24718fc4953a59842e1d8d27fb7dfb770f4f2eb7fdf65324ab6d70ba2 Oct 10 08:33:05 crc kubenswrapper[4732]: I1010 08:33:05.867644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76531d1c-d162-4a95-9764-be79581cd832","Type":"ContainerStarted","Data":"b3e0afe24718fc4953a59842e1d8d27fb7dfb770f4f2eb7fdf65324ab6d70ba2"} Oct 10 08:33:06 crc kubenswrapper[4732]: I1010 08:33:06.889738 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76531d1c-d162-4a95-9764-be79581cd832","Type":"ContainerStarted","Data":"8e1463006d9ad19587c65d0bbba9e15992ce52f93b6797b2d2400dd7c79115ae"} Oct 10 08:33:06 crc kubenswrapper[4732]: I1010 08:33:06.890191 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76531d1c-d162-4a95-9764-be79581cd832","Type":"ContainerStarted","Data":"506b8fad58356baf9d53cbe8711ea6835b6b552e0fc1abf6997a38c1a8ea9ed8"} Oct 10 08:33:07 crc kubenswrapper[4732]: I1010 08:33:07.028456 4732 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-784f-account-create-z8p4z"] Oct 10 08:33:07 crc kubenswrapper[4732]: I1010 08:33:07.038361 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-784f-account-create-z8p4z"] Oct 10 08:33:07 crc kubenswrapper[4732]: I1010 08:33:07.671040 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85dab8cc-da56-49c1-af51-eb0b2a24f3a9" path="/var/lib/kubelet/pods/85dab8cc-da56-49c1-af51-eb0b2a24f3a9/volumes" Oct 10 08:33:07 crc kubenswrapper[4732]: I1010 08:33:07.901441 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76531d1c-d162-4a95-9764-be79581cd832","Type":"ContainerStarted","Data":"ee42c5bb18bc41b2957e15481d01c1623bf211acff23beabffd4fb63fa36c251"} Oct 10 08:33:08 crc kubenswrapper[4732]: I1010 08:33:08.921036 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76531d1c-d162-4a95-9764-be79581cd832","Type":"ContainerStarted","Data":"90eaa44a24036783c589b95daf1de99abbf1a4017ba96c70ce1fc1140637d049"} Oct 10 08:33:08 crc kubenswrapper[4732]: I1010 08:33:08.921579 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 10 08:33:25 crc kubenswrapper[4732]: I1010 08:33:25.356447 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:33:25 crc kubenswrapper[4732]: I1010 08:33:25.356978 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 
08:33:25 crc kubenswrapper[4732]: I1010 08:33:25.357028 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:33:25 crc kubenswrapper[4732]: I1010 08:33:25.357855 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf642036695cfd106f68d43cb38c2eaa4d9337092959ec0ecee4949724bd8c41"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:33:25 crc kubenswrapper[4732]: I1010 08:33:25.357917 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://bf642036695cfd106f68d43cb38c2eaa4d9337092959ec0ecee4949724bd8c41" gracePeriod=600 Oct 10 08:33:26 crc kubenswrapper[4732]: I1010 08:33:26.080730 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"bf642036695cfd106f68d43cb38c2eaa4d9337092959ec0ecee4949724bd8c41"} Oct 10 08:33:26 crc kubenswrapper[4732]: I1010 08:33:26.080764 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="bf642036695cfd106f68d43cb38c2eaa4d9337092959ec0ecee4949724bd8c41" exitCode=0 Oct 10 08:33:26 crc kubenswrapper[4732]: I1010 08:33:26.081101 4732 scope.go:117] "RemoveContainer" containerID="6ef4b5fd9898a52af76e502a2541a7cc147f3d8022eccbdadde3b5fb91828ef1" Oct 10 08:33:26 crc kubenswrapper[4732]: I1010 08:33:26.081117 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff"} Oct 10 08:33:26 crc kubenswrapper[4732]: I1010 08:33:26.104680 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=19.511657488 podStartE2EDuration="22.104659283s" podCreationTimestamp="2025-10-10 08:33:04 +0000 UTC" firstStartedPulling="2025-10-10 08:33:05.726387018 +0000 UTC m=+6112.795978259" lastFinishedPulling="2025-10-10 08:33:08.319388823 +0000 UTC m=+6115.388980054" observedRunningTime="2025-10-10 08:33:08.970119188 +0000 UTC m=+6116.039710449" watchObservedRunningTime="2025-10-10 08:33:26.104659283 +0000 UTC m=+6133.174250534" Oct 10 08:33:31 crc kubenswrapper[4732]: I1010 08:33:31.037050 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zthxv"] Oct 10 08:33:31 crc kubenswrapper[4732]: I1010 08:33:31.046608 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zthxv"] Oct 10 08:33:31 crc kubenswrapper[4732]: I1010 08:33:31.671295 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17871c51-5f9e-46b9-975e-8fd40a25c9df" path="/var/lib/kubelet/pods/17871c51-5f9e-46b9-975e-8fd40a25c9df/volumes" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.186345 4732 generic.go:334] "Generic (PLEG): container finished" podID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerID="366cbe97961ca2a94aa3e23dee9a33541f7cb465f617e861f8ebd0c56fd65c8d" exitCode=137 Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.186634 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerDied","Data":"366cbe97961ca2a94aa3e23dee9a33541f7cb465f617e861f8ebd0c56fd65c8d"} Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.481205 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.572972 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-scripts\") pod \"6c227997-a6d0-403f-a0d3-364f813c6f32\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.573020 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-combined-ca-bundle\") pod \"6c227997-a6d0-403f-a0d3-364f813c6f32\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.573178 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f7hb\" (UniqueName: \"kubernetes.io/projected/6c227997-a6d0-403f-a0d3-364f813c6f32-kube-api-access-6f7hb\") pod \"6c227997-a6d0-403f-a0d3-364f813c6f32\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.573304 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-config-data\") pod \"6c227997-a6d0-403f-a0d3-364f813c6f32\" (UID: \"6c227997-a6d0-403f-a0d3-364f813c6f32\") " Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.579340 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-scripts" (OuterVolumeSpecName: "scripts") pod "6c227997-a6d0-403f-a0d3-364f813c6f32" (UID: "6c227997-a6d0-403f-a0d3-364f813c6f32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.581438 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c227997-a6d0-403f-a0d3-364f813c6f32-kube-api-access-6f7hb" (OuterVolumeSpecName: "kube-api-access-6f7hb") pod "6c227997-a6d0-403f-a0d3-364f813c6f32" (UID: "6c227997-a6d0-403f-a0d3-364f813c6f32"). InnerVolumeSpecName "kube-api-access-6f7hb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.675295 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f7hb\" (UniqueName: \"kubernetes.io/projected/6c227997-a6d0-403f-a0d3-364f813c6f32-kube-api-access-6f7hb\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.675329 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-scripts\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.701164 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-config-data" (OuterVolumeSpecName: "config-data") pod "6c227997-a6d0-403f-a0d3-364f813c6f32" (UID: "6c227997-a6d0-403f-a0d3-364f813c6f32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.711267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c227997-a6d0-403f-a0d3-364f813c6f32" (UID: "6c227997-a6d0-403f-a0d3-364f813c6f32"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.780039 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:34 crc kubenswrapper[4732]: I1010 08:33:34.780076 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c227997-a6d0-403f-a0d3-364f813c6f32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.198469 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6c227997-a6d0-403f-a0d3-364f813c6f32","Type":"ContainerDied","Data":"02a9711563b85e49d41892896d5d4841bfd70937e0ef8b06cc84f186c6018785"} Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.198763 4732 scope.go:117] "RemoveContainer" containerID="366cbe97961ca2a94aa3e23dee9a33541f7cb465f617e861f8ebd0c56fd65c8d" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.198543 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.230792 4732 scope.go:117] "RemoveContainer" containerID="1dd1278df606935a5bc35c51c7fcea8d13210485210ebebbb63e0691262140ad" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.248519 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.258939 4732 scope.go:117] "RemoveContainer" containerID="ed09c45323fb656075ce0a8645d8aca7e44c5f165f2fac31a6f9d677973d75ab" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.259154 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.276621 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 10 08:33:35 crc kubenswrapper[4732]: E1010 08:33:35.277132 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-api" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.277151 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-api" Oct 10 08:33:35 crc kubenswrapper[4732]: E1010 08:33:35.277800 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-listener" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.277811 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-listener" Oct 10 08:33:35 crc kubenswrapper[4732]: E1010 08:33:35.277831 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-evaluator" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.277838 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-evaluator" 
Oct 10 08:33:35 crc kubenswrapper[4732]: E1010 08:33:35.277850 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-notifier" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.277857 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-notifier" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.278083 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-api" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.278110 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-notifier" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.278122 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-listener" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.278136 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" containerName="aodh-evaluator" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.280319 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.283298 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.283556 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.287734 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.287861 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-public-tls-certs\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.287915 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.287907 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-scripts\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.288057 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-config-data\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.288159 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxtkh\" (UniqueName: 
\"kubernetes.io/projected/f9b41169-6ef6-4089-a8d2-b528da4862e9-kube-api-access-bxtkh\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.288233 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.288341 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-internal-tls-certs\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.293740 4732 scope.go:117] "RemoveContainer" containerID="84366756b0f02881702800459e645924bb46f63379426ae4cbc11670ae763dce" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.294063 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-jtl65" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.303761 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.308746 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.389307 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-internal-tls-certs\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.389443 
4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-public-tls-certs\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.389473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-scripts\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.389541 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-config-data\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.389585 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxtkh\" (UniqueName: \"kubernetes.io/projected/f9b41169-6ef6-4089-a8d2-b528da4862e9-kube-api-access-bxtkh\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.389653 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.396463 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-internal-tls-certs\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" 
Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.398012 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-public-tls-certs\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.398890 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-config-data\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.399532 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-combined-ca-bundle\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.401352 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9b41169-6ef6-4089-a8d2-b528da4862e9-scripts\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.415723 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxtkh\" (UniqueName: \"kubernetes.io/projected/f9b41169-6ef6-4089-a8d2-b528da4862e9-kube-api-access-bxtkh\") pod \"aodh-0\" (UID: \"f9b41169-6ef6-4089-a8d2-b528da4862e9\") " pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.564287 4732 scope.go:117] "RemoveContainer" containerID="4c4823b6931c0d2801aa4703bd727a3689261d41b68e4b197c5e33dbfbf0b3fd" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.597949 4732 scope.go:117] "RemoveContainer" 
containerID="22194aef04bf1366955ade452fe2be591b4cdbf3cb21d9d2f2f05f37e156282e" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.611878 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.643408 4732 scope.go:117] "RemoveContainer" containerID="0ab9c90065762eb79824bd5b2850939f2fa691e047b43cf96227aee6c59de592" Oct 10 08:33:35 crc kubenswrapper[4732]: I1010 08:33:35.673794 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c227997-a6d0-403f-a0d3-364f813c6f32" path="/var/lib/kubelet/pods/6c227997-a6d0-403f-a0d3-364f813c6f32/volumes" Oct 10 08:33:36 crc kubenswrapper[4732]: I1010 08:33:36.186289 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 10 08:33:36 crc kubenswrapper[4732]: W1010 08:33:36.192824 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b41169_6ef6_4089_a8d2_b528da4862e9.slice/crio-2a1dd53b2da2799097ad3d3d0534a1e125d52c64c44837af06ffb8d2c84e9aef WatchSource:0}: Error finding container 2a1dd53b2da2799097ad3d3d0534a1e125d52c64c44837af06ffb8d2c84e9aef: Status 404 returned error can't find the container with id 2a1dd53b2da2799097ad3d3d0534a1e125d52c64c44837af06ffb8d2c84e9aef Oct 10 08:33:36 crc kubenswrapper[4732]: I1010 08:33:36.209064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f9b41169-6ef6-4089-a8d2-b528da4862e9","Type":"ContainerStarted","Data":"2a1dd53b2da2799097ad3d3d0534a1e125d52c64c44837af06ffb8d2c84e9aef"} Oct 10 08:33:37 crc kubenswrapper[4732]: I1010 08:33:37.221169 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f9b41169-6ef6-4089-a8d2-b528da4862e9","Type":"ContainerStarted","Data":"630df1e0b358109cc9e3471c8f9e75cd08b2930dab5551edcae24f38baea06b8"} Oct 10 08:33:37 crc kubenswrapper[4732]: I1010 
08:33:37.221568 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f9b41169-6ef6-4089-a8d2-b528da4862e9","Type":"ContainerStarted","Data":"4c4cbe8cb055988061997d6fb3b77f1130a3b451d4b93ebd50f78db4c7f49832"} Oct 10 08:33:38 crc kubenswrapper[4732]: I1010 08:33:38.246382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f9b41169-6ef6-4089-a8d2-b528da4862e9","Type":"ContainerStarted","Data":"6a7812286f3adcc822a65d237b861d77b6fac7d418bdb004466b767b3b4cb498"} Oct 10 08:33:38 crc kubenswrapper[4732]: I1010 08:33:38.246702 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"f9b41169-6ef6-4089-a8d2-b528da4862e9","Type":"ContainerStarted","Data":"144919a18550b4fe383c9b6416ace4483e88942bcf8d64a6ed33523f820f3264"} Oct 10 08:33:38 crc kubenswrapper[4732]: I1010 08:33:38.301619 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.915965816 podStartE2EDuration="3.301598171s" podCreationTimestamp="2025-10-10 08:33:35 +0000 UTC" firstStartedPulling="2025-10-10 08:33:36.196311045 +0000 UTC m=+6143.265902286" lastFinishedPulling="2025-10-10 08:33:37.5819434 +0000 UTC m=+6144.651534641" observedRunningTime="2025-10-10 08:33:38.289282611 +0000 UTC m=+6145.358873862" watchObservedRunningTime="2025-10-10 08:33:38.301598171 +0000 UTC m=+6145.371189412" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.847838 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b67b9655c-f7t5b"] Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.850475 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.861531 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.861559 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b67b9655c-f7t5b"] Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.902650 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-sb\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.902729 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-dns-svc\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.902770 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-config\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.902810 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqq98\" (UniqueName: \"kubernetes.io/projected/9f5dc646-8801-494b-a9d8-ab1ddb930b28-kube-api-access-tqq98\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " 
pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.902901 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-nb\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:40 crc kubenswrapper[4732]: I1010 08:33:40.902955 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-openstack-cell1\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.005614 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-config\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.005816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqq98\" (UniqueName: \"kubernetes.io/projected/9f5dc646-8801-494b-a9d8-ab1ddb930b28-kube-api-access-tqq98\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.005984 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-nb\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " 
pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.006075 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-openstack-cell1\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.006319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-sb\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.006369 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-dns-svc\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.006910 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-openstack-cell1\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.006931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-config\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 
08:33:41.007001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-nb\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.007010 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-sb\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.007525 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-dns-svc\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.023561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqq98\" (UniqueName: \"kubernetes.io/projected/9f5dc646-8801-494b-a9d8-ab1ddb930b28-kube-api-access-tqq98\") pod \"dnsmasq-dns-5b67b9655c-f7t5b\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.171231 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:41 crc kubenswrapper[4732]: I1010 08:33:41.643579 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b67b9655c-f7t5b"] Oct 10 08:33:42 crc kubenswrapper[4732]: I1010 08:33:42.301741 4732 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerID="e866a582388f3bb611f09620f0ec81d3a46aabc433451b9110e2d48e73968703" exitCode=0 Oct 10 08:33:42 crc kubenswrapper[4732]: I1010 08:33:42.301997 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" event={"ID":"9f5dc646-8801-494b-a9d8-ab1ddb930b28","Type":"ContainerDied","Data":"e866a582388f3bb611f09620f0ec81d3a46aabc433451b9110e2d48e73968703"} Oct 10 08:33:42 crc kubenswrapper[4732]: I1010 08:33:42.302030 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" event={"ID":"9f5dc646-8801-494b-a9d8-ab1ddb930b28","Type":"ContainerStarted","Data":"bc0e8bcea41aafdb30f1de7753f461c92ac5fe84bc64da1ecc423c961fd2e906"} Oct 10 08:33:43 crc kubenswrapper[4732]: I1010 08:33:43.313252 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" event={"ID":"9f5dc646-8801-494b-a9d8-ab1ddb930b28","Type":"ContainerStarted","Data":"a4a90fb91b9bd6cd3cdc7ace0289d3e8ee8d5240d1fece1347b6063f0953cf52"} Oct 10 08:33:43 crc kubenswrapper[4732]: I1010 08:33:43.313817 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:43 crc kubenswrapper[4732]: I1010 08:33:43.343071 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" podStartSLOduration=3.343053758 podStartE2EDuration="3.343053758s" podCreationTimestamp="2025-10-10 08:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:33:43.334383826 +0000 UTC m=+6150.403975087" watchObservedRunningTime="2025-10-10 08:33:43.343053758 +0000 UTC m=+6150.412644999" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.172843 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.252703 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65fd6b8f6f-bdkjv"] Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.252921 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" podUID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerName="dnsmasq-dns" containerID="cri-o://9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230" gracePeriod=10 Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.603474 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8ff6dbf6c-6kbs5"] Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.605991 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.623918 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8ff6dbf6c-6kbs5"] Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.721050 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-ovsdbserver-sb\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.721476 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-config\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.721638 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-ovsdbserver-nb\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.721806 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-dns-svc\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.721907 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-openstack-cell1\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.722030 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wglmj\" (UniqueName: \"kubernetes.io/projected/000c5dfc-e5f4-49de-858d-d8a01e3acebc-kube-api-access-wglmj\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.827228 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-ovsdbserver-sb\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.827578 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-config\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.827623 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-ovsdbserver-nb\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.827662 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-dns-svc\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.827711 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-openstack-cell1\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.827765 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wglmj\" (UniqueName: \"kubernetes.io/projected/000c5dfc-e5f4-49de-858d-d8a01e3acebc-kube-api-access-wglmj\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.828412 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-ovsdbserver-sb\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.828845 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-ovsdbserver-nb\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.829288 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: 
\"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-openstack-cell1\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.829428 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-config\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.829554 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/000c5dfc-e5f4-49de-858d-d8a01e3acebc-dns-svc\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.867741 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wglmj\" (UniqueName: \"kubernetes.io/projected/000c5dfc-e5f4-49de-858d-d8a01e3acebc-kube-api-access-wglmj\") pod \"dnsmasq-dns-8ff6dbf6c-6kbs5\" (UID: \"000c5dfc-e5f4-49de-858d-d8a01e3acebc\") " pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:51 crc kubenswrapper[4732]: I1010 08:33:51.934007 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.134018 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.162408 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-dns-svc\") pod \"cfb3b72c-df51-4d81-9318-98f6e1393879\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.162557 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-config\") pod \"cfb3b72c-df51-4d81-9318-98f6e1393879\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.162608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/cfb3b72c-df51-4d81-9318-98f6e1393879-kube-api-access-ff5md\") pod \"cfb3b72c-df51-4d81-9318-98f6e1393879\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.162680 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-nb\") pod \"cfb3b72c-df51-4d81-9318-98f6e1393879\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.162730 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-sb\") pod \"cfb3b72c-df51-4d81-9318-98f6e1393879\" (UID: \"cfb3b72c-df51-4d81-9318-98f6e1393879\") " Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.174340 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cfb3b72c-df51-4d81-9318-98f6e1393879-kube-api-access-ff5md" (OuterVolumeSpecName: "kube-api-access-ff5md") pod "cfb3b72c-df51-4d81-9318-98f6e1393879" (UID: "cfb3b72c-df51-4d81-9318-98f6e1393879"). InnerVolumeSpecName "kube-api-access-ff5md". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.224483 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfb3b72c-df51-4d81-9318-98f6e1393879" (UID: "cfb3b72c-df51-4d81-9318-98f6e1393879"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.274331 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfb3b72c-df51-4d81-9318-98f6e1393879" (UID: "cfb3b72c-df51-4d81-9318-98f6e1393879"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.274819 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-config" (OuterVolumeSpecName: "config") pod "cfb3b72c-df51-4d81-9318-98f6e1393879" (UID: "cfb3b72c-df51-4d81-9318-98f6e1393879"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.275871 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfb3b72c-df51-4d81-9318-98f6e1393879" (UID: "cfb3b72c-df51-4d81-9318-98f6e1393879"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.276159 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff5md\" (UniqueName: \"kubernetes.io/projected/cfb3b72c-df51-4d81-9318-98f6e1393879-kube-api-access-ff5md\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.276179 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.378068 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.378297 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.378311 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfb3b72c-df51-4d81-9318-98f6e1393879-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.414616 4732 generic.go:334] "Generic (PLEG): container finished" podID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerID="9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230" exitCode=0 Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.414664 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.414669 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" event={"ID":"cfb3b72c-df51-4d81-9318-98f6e1393879","Type":"ContainerDied","Data":"9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230"} Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.414741 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fd6b8f6f-bdkjv" event={"ID":"cfb3b72c-df51-4d81-9318-98f6e1393879","Type":"ContainerDied","Data":"394c11ab6623a5fd0b3ab7b8296138bc1ef1fdc19b0cdab6fc16e5c318a12148"} Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.414766 4732 scope.go:117] "RemoveContainer" containerID="9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.445319 4732 scope.go:117] "RemoveContainer" containerID="10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.452492 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65fd6b8f6f-bdkjv"] Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.462424 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65fd6b8f6f-bdkjv"] Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.481658 4732 scope.go:117] "RemoveContainer" containerID="9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230" Oct 10 08:33:52 crc kubenswrapper[4732]: E1010 08:33:52.482279 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230\": container with ID starting with 9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230 not found: ID does not exist" 
containerID="9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.482354 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230"} err="failed to get container status \"9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230\": rpc error: code = NotFound desc = could not find container \"9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230\": container with ID starting with 9e6d5642d67dd62ffb0529d0b7013ae36ef3767735a277b02e4f80a232565230 not found: ID does not exist" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.482394 4732 scope.go:117] "RemoveContainer" containerID="10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227" Oct 10 08:33:52 crc kubenswrapper[4732]: E1010 08:33:52.483964 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227\": container with ID starting with 10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227 not found: ID does not exist" containerID="10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.484008 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227"} err="failed to get container status \"10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227\": rpc error: code = NotFound desc = could not find container \"10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227\": container with ID starting with 10419863dd91628296ab82b9e536e25849897b5e76ed7410f0e5101319425227 not found: ID does not exist" Oct 10 08:33:52 crc kubenswrapper[4732]: I1010 08:33:52.570982 4732 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8ff6dbf6c-6kbs5"] Oct 10 08:33:52 crc kubenswrapper[4732]: W1010 08:33:52.577211 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod000c5dfc_e5f4_49de_858d_d8a01e3acebc.slice/crio-453095f9e3377d3def3e3dafef0b70f02aaeb857d618048467358805861e16a2 WatchSource:0}: Error finding container 453095f9e3377d3def3e3dafef0b70f02aaeb857d618048467358805861e16a2: Status 404 returned error can't find the container with id 453095f9e3377d3def3e3dafef0b70f02aaeb857d618048467358805861e16a2 Oct 10 08:33:53 crc kubenswrapper[4732]: I1010 08:33:53.424348 4732 generic.go:334] "Generic (PLEG): container finished" podID="000c5dfc-e5f4-49de-858d-d8a01e3acebc" containerID="8a5348f52cc00233e161f7727c39f5cf6b9878bd9f73f53e5b7e5d6e1d9fb7c6" exitCode=0 Oct 10 08:33:53 crc kubenswrapper[4732]: I1010 08:33:53.424429 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" event={"ID":"000c5dfc-e5f4-49de-858d-d8a01e3acebc","Type":"ContainerDied","Data":"8a5348f52cc00233e161f7727c39f5cf6b9878bd9f73f53e5b7e5d6e1d9fb7c6"} Oct 10 08:33:53 crc kubenswrapper[4732]: I1010 08:33:53.424910 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" event={"ID":"000c5dfc-e5f4-49de-858d-d8a01e3acebc","Type":"ContainerStarted","Data":"453095f9e3377d3def3e3dafef0b70f02aaeb857d618048467358805861e16a2"} Oct 10 08:33:53 crc kubenswrapper[4732]: I1010 08:33:53.688469 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb3b72c-df51-4d81-9318-98f6e1393879" path="/var/lib/kubelet/pods/cfb3b72c-df51-4d81-9318-98f6e1393879/volumes" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.218012 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g7627"] Oct 10 08:33:54 crc kubenswrapper[4732]: E1010 08:33:54.219494 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerName="init" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.219531 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerName="init" Oct 10 08:33:54 crc kubenswrapper[4732]: E1010 08:33:54.219560 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerName="dnsmasq-dns" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.219572 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerName="dnsmasq-dns" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.219818 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb3b72c-df51-4d81-9318-98f6e1393879" containerName="dnsmasq-dns" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.221617 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.229369 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7627"] Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.244887 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-utilities\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.244972 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-catalog-content\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.245061 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxfqz\" (UniqueName: \"kubernetes.io/projected/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-kube-api-access-nxfqz\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.346622 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-utilities\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.346729 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-catalog-content\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.346769 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxfqz\" (UniqueName: \"kubernetes.io/projected/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-kube-api-access-nxfqz\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.347547 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-utilities\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.347719 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-catalog-content\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.365930 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxfqz\" (UniqueName: \"kubernetes.io/projected/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-kube-api-access-nxfqz\") pod \"community-operators-g7627\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.435522 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" event={"ID":"000c5dfc-e5f4-49de-858d-d8a01e3acebc","Type":"ContainerStarted","Data":"65771e3c682e36daad764e0915a522fa310e19732c61a36c8f2b68fbdfc002fd"} Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.435846 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.463082 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" podStartSLOduration=3.463066867 podStartE2EDuration="3.463066867s" podCreationTimestamp="2025-10-10 08:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:33:54.453548112 +0000 UTC m=+6161.523139353" watchObservedRunningTime="2025-10-10 08:33:54.463066867 +0000 UTC m=+6161.532658108" Oct 10 08:33:54 crc kubenswrapper[4732]: I1010 08:33:54.553865 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7627" Oct 10 08:33:55 crc kubenswrapper[4732]: I1010 08:33:55.113710 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g7627"] Oct 10 08:33:55 crc kubenswrapper[4732]: I1010 08:33:55.464315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerStarted","Data":"202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183"} Oct 10 08:33:55 crc kubenswrapper[4732]: I1010 08:33:55.465933 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerStarted","Data":"52ff3c6889baf7a27efe971e479f750e565e37557bcb5d64a6a1b8551b7795a9"} Oct 10 08:33:56 crc kubenswrapper[4732]: I1010 08:33:56.473962 4732 generic.go:334] "Generic (PLEG): container finished" podID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerID="202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183" exitCode=0 Oct 10 08:33:56 crc kubenswrapper[4732]: I1010 08:33:56.474007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerDied","Data":"202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183"} Oct 10 08:33:56 crc kubenswrapper[4732]: I1010 08:33:56.474335 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerStarted","Data":"6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592"} Oct 10 08:33:58 crc kubenswrapper[4732]: I1010 08:33:58.501030 4732 generic.go:334] "Generic (PLEG): container finished" podID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" 
containerID="6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592" exitCode=0 Oct 10 08:33:58 crc kubenswrapper[4732]: I1010 08:33:58.501118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerDied","Data":"6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592"} Oct 10 08:33:59 crc kubenswrapper[4732]: I1010 08:33:59.513319 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerStarted","Data":"7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a"} Oct 10 08:33:59 crc kubenswrapper[4732]: I1010 08:33:59.542728 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g7627" podStartSLOduration=2.067540741 podStartE2EDuration="5.542684348s" podCreationTimestamp="2025-10-10 08:33:54 +0000 UTC" firstStartedPulling="2025-10-10 08:33:55.464936705 +0000 UTC m=+6162.534527946" lastFinishedPulling="2025-10-10 08:33:58.940080312 +0000 UTC m=+6166.009671553" observedRunningTime="2025-10-10 08:33:59.536043381 +0000 UTC m=+6166.605634642" watchObservedRunningTime="2025-10-10 08:33:59.542684348 +0000 UTC m=+6166.612275609" Oct 10 08:34:01 crc kubenswrapper[4732]: I1010 08:34:01.935870 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8ff6dbf6c-6kbs5" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.011439 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b67b9655c-f7t5b"] Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.011663 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" podUID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerName="dnsmasq-dns" 
containerID="cri-o://a4a90fb91b9bd6cd3cdc7ace0289d3e8ee8d5240d1fece1347b6063f0953cf52" gracePeriod=10 Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.540819 4732 generic.go:334] "Generic (PLEG): container finished" podID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerID="a4a90fb91b9bd6cd3cdc7ace0289d3e8ee8d5240d1fece1347b6063f0953cf52" exitCode=0 Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.540911 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" event={"ID":"9f5dc646-8801-494b-a9d8-ab1ddb930b28","Type":"ContainerDied","Data":"a4a90fb91b9bd6cd3cdc7ace0289d3e8ee8d5240d1fece1347b6063f0953cf52"} Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.541181 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" event={"ID":"9f5dc646-8801-494b-a9d8-ab1ddb930b28","Type":"ContainerDied","Data":"bc0e8bcea41aafdb30f1de7753f461c92ac5fe84bc64da1ecc423c961fd2e906"} Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.541202 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0e8bcea41aafdb30f1de7753f461c92ac5fe84bc64da1ecc423c961fd2e906" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.566496 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.612563 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqq98\" (UniqueName: \"kubernetes.io/projected/9f5dc646-8801-494b-a9d8-ab1ddb930b28-kube-api-access-tqq98\") pod \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.612651 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-nb\") pod \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.612753 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-openstack-cell1\") pod \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.613808 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-dns-svc\") pod \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.613893 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-sb\") pod \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.613953 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-config\") pod \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\" (UID: \"9f5dc646-8801-494b-a9d8-ab1ddb930b28\") " Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.619508 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5dc646-8801-494b-a9d8-ab1ddb930b28-kube-api-access-tqq98" (OuterVolumeSpecName: "kube-api-access-tqq98") pod "9f5dc646-8801-494b-a9d8-ab1ddb930b28" (UID: "9f5dc646-8801-494b-a9d8-ab1ddb930b28"). InnerVolumeSpecName "kube-api-access-tqq98". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.683087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f5dc646-8801-494b-a9d8-ab1ddb930b28" (UID: "9f5dc646-8801-494b-a9d8-ab1ddb930b28"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.683088 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-config" (OuterVolumeSpecName: "config") pod "9f5dc646-8801-494b-a9d8-ab1ddb930b28" (UID: "9f5dc646-8801-494b-a9d8-ab1ddb930b28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.683618 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f5dc646-8801-494b-a9d8-ab1ddb930b28" (UID: "9f5dc646-8801-494b-a9d8-ab1ddb930b28"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.693806 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "9f5dc646-8801-494b-a9d8-ab1ddb930b28" (UID: "9f5dc646-8801-494b-a9d8-ab1ddb930b28"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.706657 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f5dc646-8801-494b-a9d8-ab1ddb930b28" (UID: "9f5dc646-8801-494b-a9d8-ab1ddb930b28"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.717244 4732 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.717276 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.717287 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-config\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.717298 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqq98\" (UniqueName: \"kubernetes.io/projected/9f5dc646-8801-494b-a9d8-ab1ddb930b28-kube-api-access-tqq98\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:02 crc 
kubenswrapper[4732]: I1010 08:34:02.717307 4732 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:02 crc kubenswrapper[4732]: I1010 08:34:02.717315 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9f5dc646-8801-494b-a9d8-ab1ddb930b28-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:03 crc kubenswrapper[4732]: I1010 08:34:03.038619 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-bnt72"] Oct 10 08:34:03 crc kubenswrapper[4732]: I1010 08:34:03.047455 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-bnt72"] Oct 10 08:34:03 crc kubenswrapper[4732]: I1010 08:34:03.548730 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b67b9655c-f7t5b" Oct 10 08:34:03 crc kubenswrapper[4732]: I1010 08:34:03.588280 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b67b9655c-f7t5b"] Oct 10 08:34:03 crc kubenswrapper[4732]: I1010 08:34:03.600472 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b67b9655c-f7t5b"] Oct 10 08:34:03 crc kubenswrapper[4732]: I1010 08:34:03.672869 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce602c1-4d4d-40ce-8712-c0a621e288b0" path="/var/lib/kubelet/pods/2ce602c1-4d4d-40ce-8712-c0a621e288b0/volumes" Oct 10 08:34:03 crc kubenswrapper[4732]: I1010 08:34:03.673584 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" path="/var/lib/kubelet/pods/9f5dc646-8801-494b-a9d8-ab1ddb930b28/volumes" Oct 10 08:34:04 crc kubenswrapper[4732]: I1010 08:34:04.553983 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-g7627" Oct 10 08:34:04 crc kubenswrapper[4732]: I1010 08:34:04.557465 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g7627" Oct 10 08:34:04 crc kubenswrapper[4732]: I1010 08:34:04.625480 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g7627" Oct 10 08:34:05 crc kubenswrapper[4732]: I1010 08:34:05.615032 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g7627" Oct 10 08:34:05 crc kubenswrapper[4732]: I1010 08:34:05.671272 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7627"] Oct 10 08:34:07 crc kubenswrapper[4732]: I1010 08:34:07.587852 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g7627" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="registry-server" containerID="cri-o://7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a" gracePeriod=2 Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.112047 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g7627" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.234122 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxfqz\" (UniqueName: \"kubernetes.io/projected/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-kube-api-access-nxfqz\") pod \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.234239 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-catalog-content\") pod \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.234293 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-utilities\") pod \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\" (UID: \"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0\") " Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.235457 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-utilities" (OuterVolumeSpecName: "utilities") pod "ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" (UID: "ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.241661 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-kube-api-access-nxfqz" (OuterVolumeSpecName: "kube-api-access-nxfqz") pod "ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" (UID: "ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0"). InnerVolumeSpecName "kube-api-access-nxfqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.285195 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" (UID: "ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.336883 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxfqz\" (UniqueName: \"kubernetes.io/projected/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-kube-api-access-nxfqz\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.336915 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.336926 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.598236 4732 generic.go:334] "Generic (PLEG): container finished" podID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerID="7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a" exitCode=0 Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.598276 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerDied","Data":"7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a"} Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.598299 4732 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-g7627" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.598320 4732 scope.go:117] "RemoveContainer" containerID="7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.598308 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g7627" event={"ID":"ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0","Type":"ContainerDied","Data":"52ff3c6889baf7a27efe971e479f750e565e37557bcb5d64a6a1b8551b7795a9"} Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.618533 4732 scope.go:117] "RemoveContainer" containerID="6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.639259 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g7627"] Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.651329 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g7627"] Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.654334 4732 scope.go:117] "RemoveContainer" containerID="202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.694322 4732 scope.go:117] "RemoveContainer" containerID="7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a" Oct 10 08:34:08 crc kubenswrapper[4732]: E1010 08:34:08.694859 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a\": container with ID starting with 7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a not found: ID does not exist" containerID="7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.694900 
4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a"} err="failed to get container status \"7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a\": rpc error: code = NotFound desc = could not find container \"7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a\": container with ID starting with 7cbfc4ea9452264ab03f8376154bdab1aa8060e65b3c46f4576ecdefb8cf2a8a not found: ID does not exist" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.694930 4732 scope.go:117] "RemoveContainer" containerID="6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592" Oct 10 08:34:08 crc kubenswrapper[4732]: E1010 08:34:08.695320 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592\": container with ID starting with 6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592 not found: ID does not exist" containerID="6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.695351 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592"} err="failed to get container status \"6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592\": rpc error: code = NotFound desc = could not find container \"6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592\": container with ID starting with 6d13351018a0bade28c1c8ad6cc5e16f757cac3e7d238f6337f0512beeede592 not found: ID does not exist" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.695376 4732 scope.go:117] "RemoveContainer" containerID="202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183" Oct 10 08:34:08 crc kubenswrapper[4732]: E1010 
08:34:08.695763 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183\": container with ID starting with 202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183 not found: ID does not exist" containerID="202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183" Oct 10 08:34:08 crc kubenswrapper[4732]: I1010 08:34:08.695785 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183"} err="failed to get container status \"202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183\": rpc error: code = NotFound desc = could not find container \"202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183\": container with ID starting with 202262ab2a5134be29d67ad8627f8f29374f3a70a473a6ab167198024a518183 not found: ID does not exist" Oct 10 08:34:09 crc kubenswrapper[4732]: I1010 08:34:09.672317 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" path="/var/lib/kubelet/pods/ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0/volumes" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.175114 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8"] Oct 10 08:34:12 crc kubenswrapper[4732]: E1010 08:34:12.177134 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="extract-content" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.177238 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="extract-content" Oct 10 08:34:12 crc kubenswrapper[4732]: E1010 08:34:12.177343 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerName="dnsmasq-dns" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.177414 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerName="dnsmasq-dns" Oct 10 08:34:12 crc kubenswrapper[4732]: E1010 08:34:12.177486 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="extract-utilities" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.177561 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="extract-utilities" Oct 10 08:34:12 crc kubenswrapper[4732]: E1010 08:34:12.177641 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="registry-server" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.177726 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="registry-server" Oct 10 08:34:12 crc kubenswrapper[4732]: E1010 08:34:12.177820 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerName="init" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.177907 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerName="init" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.178185 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee4b9cc1-ebe4-4dd3-9db8-48e69fefffc0" containerName="registry-server" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.178268 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5dc646-8801-494b-a9d8-ab1ddb930b28" containerName="dnsmasq-dns" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.179543 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.182274 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.182362 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.182405 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.188829 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.195672 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8"] Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.325533 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7jv\" (UniqueName: \"kubernetes.io/projected/56daaa72-0707-450e-946f-649e04f9a0bc-kube-api-access-lw7jv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.325989 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.326103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.326227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.429574 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7jv\" (UniqueName: \"kubernetes.io/projected/56daaa72-0707-450e-946f-649e04f9a0bc-kube-api-access-lw7jv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.429847 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 
08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.429899 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.429933 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.438083 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.438963 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.439218 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.445414 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7jv\" (UniqueName: \"kubernetes.io/projected/56daaa72-0707-450e-946f-649e04f9a0bc-kube-api-access-lw7jv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:12 crc kubenswrapper[4732]: I1010 08:34:12.501763 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:13 crc kubenswrapper[4732]: I1010 08:34:13.045159 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0b3a-account-create-6ch6l"] Oct 10 08:34:13 crc kubenswrapper[4732]: I1010 08:34:13.054158 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0b3a-account-create-6ch6l"] Oct 10 08:34:13 crc kubenswrapper[4732]: I1010 08:34:13.078867 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8"] Oct 10 08:34:13 crc kubenswrapper[4732]: I1010 08:34:13.674974 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5049d408-b2a1-43a6-a2de-9516b7d0c78e" path="/var/lib/kubelet/pods/5049d408-b2a1-43a6-a2de-9516b7d0c78e/volumes" Oct 10 08:34:13 crc kubenswrapper[4732]: I1010 08:34:13.676616 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" 
event={"ID":"56daaa72-0707-450e-946f-649e04f9a0bc","Type":"ContainerStarted","Data":"db83d701772aa31172e6192b62da5c96ec450d454b24b31741c355f4d0162159"} Oct 10 08:34:21 crc kubenswrapper[4732]: I1010 08:34:21.667153 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:34:22 crc kubenswrapper[4732]: I1010 08:34:22.753575 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" event={"ID":"56daaa72-0707-450e-946f-649e04f9a0bc","Type":"ContainerStarted","Data":"23822ea877707fd7a7f8fdb15ee31a550374dadb370f0c7d4b6a61b08014825c"} Oct 10 08:34:22 crc kubenswrapper[4732]: I1010 08:34:22.784654 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" podStartSLOduration=2.207797146 podStartE2EDuration="10.784634774s" podCreationTimestamp="2025-10-10 08:34:12 +0000 UTC" firstStartedPulling="2025-10-10 08:34:13.087395185 +0000 UTC m=+6180.156986426" lastFinishedPulling="2025-10-10 08:34:21.664232813 +0000 UTC m=+6188.733824054" observedRunningTime="2025-10-10 08:34:22.771099662 +0000 UTC m=+6189.840690913" watchObservedRunningTime="2025-10-10 08:34:22.784634774 +0000 UTC m=+6189.854226015" Oct 10 08:34:23 crc kubenswrapper[4732]: I1010 08:34:23.033287 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-lw22c"] Oct 10 08:34:23 crc kubenswrapper[4732]: I1010 08:34:23.043199 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-lw22c"] Oct 10 08:34:23 crc kubenswrapper[4732]: I1010 08:34:23.674060 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5" path="/var/lib/kubelet/pods/c8b7102b-f80d-4b3a-8c8d-1baaedf67ca5/volumes" Oct 10 08:34:34 crc kubenswrapper[4732]: I1010 08:34:34.874873 4732 generic.go:334] "Generic 
(PLEG): container finished" podID="56daaa72-0707-450e-946f-649e04f9a0bc" containerID="23822ea877707fd7a7f8fdb15ee31a550374dadb370f0c7d4b6a61b08014825c" exitCode=0 Oct 10 08:34:34 crc kubenswrapper[4732]: I1010 08:34:34.874947 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" event={"ID":"56daaa72-0707-450e-946f-649e04f9a0bc","Type":"ContainerDied","Data":"23822ea877707fd7a7f8fdb15ee31a550374dadb370f0c7d4b6a61b08014825c"} Oct 10 08:34:35 crc kubenswrapper[4732]: I1010 08:34:35.984231 4732 scope.go:117] "RemoveContainer" containerID="59704af831b3620080dac096d92407b5de47d17447bf6674c20039cdd019fc6b" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.022320 4732 scope.go:117] "RemoveContainer" containerID="5a16c08c0398ebf8deb1bd17987e4d6b9fe7191d41ef8684d7ecb87beed51d98" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.069358 4732 scope.go:117] "RemoveContainer" containerID="5eb88d0285472497021f9892bac755c67c0bec73c260aad2b46c6243b83d3d87" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.362172 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.443223 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-pre-adoption-validation-combined-ca-bundle\") pod \"56daaa72-0707-450e-946f-649e04f9a0bc\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.443617 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-inventory\") pod \"56daaa72-0707-450e-946f-649e04f9a0bc\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.443799 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw7jv\" (UniqueName: \"kubernetes.io/projected/56daaa72-0707-450e-946f-649e04f9a0bc-kube-api-access-lw7jv\") pod \"56daaa72-0707-450e-946f-649e04f9a0bc\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.443863 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-ssh-key\") pod \"56daaa72-0707-450e-946f-649e04f9a0bc\" (UID: \"56daaa72-0707-450e-946f-649e04f9a0bc\") " Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.449269 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56daaa72-0707-450e-946f-649e04f9a0bc-kube-api-access-lw7jv" (OuterVolumeSpecName: "kube-api-access-lw7jv") pod "56daaa72-0707-450e-946f-649e04f9a0bc" (UID: "56daaa72-0707-450e-946f-649e04f9a0bc"). InnerVolumeSpecName "kube-api-access-lw7jv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.449589 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "56daaa72-0707-450e-946f-649e04f9a0bc" (UID: "56daaa72-0707-450e-946f-649e04f9a0bc"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.470637 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "56daaa72-0707-450e-946f-649e04f9a0bc" (UID: "56daaa72-0707-450e-946f-649e04f9a0bc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.472007 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-inventory" (OuterVolumeSpecName: "inventory") pod "56daaa72-0707-450e-946f-649e04f9a0bc" (UID: "56daaa72-0707-450e-946f-649e04f9a0bc"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.547057 4732 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.547161 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.547222 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw7jv\" (UniqueName: \"kubernetes.io/projected/56daaa72-0707-450e-946f-649e04f9a0bc-kube-api-access-lw7jv\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.547245 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/56daaa72-0707-450e-946f-649e04f9a0bc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.896869 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" event={"ID":"56daaa72-0707-450e-946f-649e04f9a0bc","Type":"ContainerDied","Data":"db83d701772aa31172e6192b62da5c96ec450d454b24b31741c355f4d0162159"} Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.896937 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db83d701772aa31172e6192b62da5c96ec450d454b24b31741c355f4d0162159" Oct 10 08:34:36 crc kubenswrapper[4732]: I1010 08:34:36.896937 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.129064 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq"] Oct 10 08:34:45 crc kubenswrapper[4732]: E1010 08:34:45.131194 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56daaa72-0707-450e-946f-649e04f9a0bc" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.131313 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56daaa72-0707-450e-946f-649e04f9a0bc" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.131659 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56daaa72-0707-450e-946f-649e04f9a0bc" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.132610 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.136212 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.136372 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.136794 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.137136 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.148077 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq"] Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.223905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.224353 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f24fk\" (UniqueName: \"kubernetes.io/projected/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-kube-api-access-f24fk\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.224457 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.224586 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.327302 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f24fk\" (UniqueName: \"kubernetes.io/projected/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-kube-api-access-f24fk\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.327898 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.327985 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.328083 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.335410 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.335830 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.337417 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc 
kubenswrapper[4732]: I1010 08:34:45.345462 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f24fk\" (UniqueName: \"kubernetes.io/projected/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-kube-api-access-f24fk\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:45 crc kubenswrapper[4732]: I1010 08:34:45.458903 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:34:46 crc kubenswrapper[4732]: I1010 08:34:46.000154 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq"] Oct 10 08:34:47 crc kubenswrapper[4732]: I1010 08:34:47.000664 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" event={"ID":"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19","Type":"ContainerStarted","Data":"118559cb653174b800879919597672796acf9fcac4227e239057ccd9bb10f369"} Oct 10 08:34:47 crc kubenswrapper[4732]: I1010 08:34:47.001431 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" event={"ID":"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19","Type":"ContainerStarted","Data":"2b9c7ab4fcd5911c75072fda4430a5b3f9da03e1fb960f84767821a6744afbca"} Oct 10 08:34:47 crc kubenswrapper[4732]: I1010 08:34:47.024572 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" podStartSLOduration=1.463931493 podStartE2EDuration="2.024548655s" podCreationTimestamp="2025-10-10 08:34:45 +0000 UTC" firstStartedPulling="2025-10-10 08:34:46.004886501 +0000 UTC m=+6213.074477732" lastFinishedPulling="2025-10-10 08:34:46.565503653 +0000 UTC m=+6213.635094894" 
observedRunningTime="2025-10-10 08:34:47.019734816 +0000 UTC m=+6214.089326057" watchObservedRunningTime="2025-10-10 08:34:47.024548655 +0000 UTC m=+6214.094139926" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.187226 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-75h7v"] Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.190267 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.202146 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75h7v"] Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.278725 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-catalog-content\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.278783 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-utilities\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.278872 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mn8n\" (UniqueName: \"kubernetes.io/projected/ba635dcb-929b-4ea1-9a7a-495b36f06df2-kube-api-access-8mn8n\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 
08:34:52.381844 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-catalog-content\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.381892 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-utilities\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.381953 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mn8n\" (UniqueName: \"kubernetes.io/projected/ba635dcb-929b-4ea1-9a7a-495b36f06df2-kube-api-access-8mn8n\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.382361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-catalog-content\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.382453 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-utilities\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.406833 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mn8n\" (UniqueName: \"kubernetes.io/projected/ba635dcb-929b-4ea1-9a7a-495b36f06df2-kube-api-access-8mn8n\") pod \"certified-operators-75h7v\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.524275 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:34:52 crc kubenswrapper[4732]: I1010 08:34:52.955091 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-75h7v"] Oct 10 08:34:53 crc kubenswrapper[4732]: I1010 08:34:53.057753 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75h7v" event={"ID":"ba635dcb-929b-4ea1-9a7a-495b36f06df2","Type":"ContainerStarted","Data":"cf51817a71dab3fd0b1f778e3fcf23668ffd8a085f6ad6c37e5bf4d0476a638b"} Oct 10 08:34:54 crc kubenswrapper[4732]: I1010 08:34:54.074038 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerID="3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7" exitCode=0 Oct 10 08:34:54 crc kubenswrapper[4732]: I1010 08:34:54.074131 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75h7v" event={"ID":"ba635dcb-929b-4ea1-9a7a-495b36f06df2","Type":"ContainerDied","Data":"3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7"} Oct 10 08:34:56 crc kubenswrapper[4732]: I1010 08:34:56.098074 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75h7v" event={"ID":"ba635dcb-929b-4ea1-9a7a-495b36f06df2","Type":"ContainerStarted","Data":"efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68"} Oct 10 08:34:56 crc kubenswrapper[4732]: E1010 08:34:56.518443 4732 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba635dcb_929b_4ea1_9a7a_495b36f06df2.slice/crio-efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba635dcb_929b_4ea1_9a7a_495b36f06df2.slice/crio-conmon-efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:34:57 crc kubenswrapper[4732]: I1010 08:34:57.110058 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerID="efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68" exitCode=0 Oct 10 08:34:57 crc kubenswrapper[4732]: I1010 08:34:57.110104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75h7v" event={"ID":"ba635dcb-929b-4ea1-9a7a-495b36f06df2","Type":"ContainerDied","Data":"efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68"} Oct 10 08:34:58 crc kubenswrapper[4732]: I1010 08:34:58.122063 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75h7v" event={"ID":"ba635dcb-929b-4ea1-9a7a-495b36f06df2","Type":"ContainerStarted","Data":"baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207"} Oct 10 08:34:58 crc kubenswrapper[4732]: I1010 08:34:58.144717 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-75h7v" podStartSLOduration=2.700569872 podStartE2EDuration="6.144682847s" podCreationTimestamp="2025-10-10 08:34:52 +0000 UTC" firstStartedPulling="2025-10-10 08:34:54.077445086 +0000 UTC m=+6221.147036327" lastFinishedPulling="2025-10-10 08:34:57.521558061 +0000 UTC m=+6224.591149302" observedRunningTime="2025-10-10 
08:34:58.140508776 +0000 UTC m=+6225.210100027" watchObservedRunningTime="2025-10-10 08:34:58.144682847 +0000 UTC m=+6225.214274088" Oct 10 08:35:02 crc kubenswrapper[4732]: I1010 08:35:02.524436 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:35:02 crc kubenswrapper[4732]: I1010 08:35:02.525012 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:35:02 crc kubenswrapper[4732]: I1010 08:35:02.574191 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:35:03 crc kubenswrapper[4732]: I1010 08:35:03.244501 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:35:03 crc kubenswrapper[4732]: I1010 08:35:03.290319 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75h7v"] Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.190096 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-75h7v" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerName="registry-server" containerID="cri-o://baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207" gracePeriod=2 Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.700682 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.872545 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mn8n\" (UniqueName: \"kubernetes.io/projected/ba635dcb-929b-4ea1-9a7a-495b36f06df2-kube-api-access-8mn8n\") pod \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.872964 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-utilities\") pod \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.873073 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-catalog-content\") pod \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\" (UID: \"ba635dcb-929b-4ea1-9a7a-495b36f06df2\") " Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.874959 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-utilities" (OuterVolumeSpecName: "utilities") pod "ba635dcb-929b-4ea1-9a7a-495b36f06df2" (UID: "ba635dcb-929b-4ea1-9a7a-495b36f06df2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.878163 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba635dcb-929b-4ea1-9a7a-495b36f06df2-kube-api-access-8mn8n" (OuterVolumeSpecName: "kube-api-access-8mn8n") pod "ba635dcb-929b-4ea1-9a7a-495b36f06df2" (UID: "ba635dcb-929b-4ea1-9a7a-495b36f06df2"). InnerVolumeSpecName "kube-api-access-8mn8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.926238 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba635dcb-929b-4ea1-9a7a-495b36f06df2" (UID: "ba635dcb-929b-4ea1-9a7a-495b36f06df2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.975492 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mn8n\" (UniqueName: \"kubernetes.io/projected/ba635dcb-929b-4ea1-9a7a-495b36f06df2-kube-api-access-8mn8n\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.975530 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:05 crc kubenswrapper[4732]: I1010 08:35:05.975540 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba635dcb-929b-4ea1-9a7a-495b36f06df2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.217899 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerID="baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207" exitCode=0 Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.217945 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-75h7v" event={"ID":"ba635dcb-929b-4ea1-9a7a-495b36f06df2","Type":"ContainerDied","Data":"baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207"} Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.217994 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-75h7v" event={"ID":"ba635dcb-929b-4ea1-9a7a-495b36f06df2","Type":"ContainerDied","Data":"cf51817a71dab3fd0b1f778e3fcf23668ffd8a085f6ad6c37e5bf4d0476a638b"} Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.218015 4732 scope.go:117] "RemoveContainer" containerID="baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.217957 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-75h7v" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.250262 4732 scope.go:117] "RemoveContainer" containerID="efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.263970 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-75h7v"] Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.273302 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-75h7v"] Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.275967 4732 scope.go:117] "RemoveContainer" containerID="3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.329855 4732 scope.go:117] "RemoveContainer" containerID="baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207" Oct 10 08:35:06 crc kubenswrapper[4732]: E1010 08:35:06.330217 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207\": container with ID starting with baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207 not found: ID does not exist" containerID="baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 
08:35:06.330243 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207"} err="failed to get container status \"baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207\": rpc error: code = NotFound desc = could not find container \"baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207\": container with ID starting with baf8031c59847adaabd00032b2a4df5a000f73f3300ebc5758fce8f745a58207 not found: ID does not exist" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.330268 4732 scope.go:117] "RemoveContainer" containerID="efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68" Oct 10 08:35:06 crc kubenswrapper[4732]: E1010 08:35:06.330635 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68\": container with ID starting with efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68 not found: ID does not exist" containerID="efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.330667 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68"} err="failed to get container status \"efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68\": rpc error: code = NotFound desc = could not find container \"efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68\": container with ID starting with efb702dfeeaf6e3995a604cd53feb96957fe6480529bee6464cd77f40f721e68 not found: ID does not exist" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.330701 4732 scope.go:117] "RemoveContainer" containerID="3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7" Oct 10 08:35:06 crc 
kubenswrapper[4732]: E1010 08:35:06.331374 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7\": container with ID starting with 3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7 not found: ID does not exist" containerID="3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7" Oct 10 08:35:06 crc kubenswrapper[4732]: I1010 08:35:06.331448 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7"} err="failed to get container status \"3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7\": rpc error: code = NotFound desc = could not find container \"3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7\": container with ID starting with 3114ef319f78db5a357562e9adfe41a6e257c6933bce58c02b574884442c99e7 not found: ID does not exist" Oct 10 08:35:07 crc kubenswrapper[4732]: I1010 08:35:07.671896 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" path="/var/lib/kubelet/pods/ba635dcb-929b-4ea1-9a7a-495b36f06df2/volumes" Oct 10 08:35:22 crc kubenswrapper[4732]: I1010 08:35:22.041200 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-t6gsv"] Oct 10 08:35:22 crc kubenswrapper[4732]: I1010 08:35:22.051375 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-t6gsv"] Oct 10 08:35:23 crc kubenswrapper[4732]: I1010 08:35:23.033117 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jxdbl"] Oct 10 08:35:23 crc kubenswrapper[4732]: I1010 08:35:23.044008 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xnvl6"] Oct 10 08:35:23 crc kubenswrapper[4732]: I1010 08:35:23.055782 4732 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jxdbl"] Oct 10 08:35:23 crc kubenswrapper[4732]: I1010 08:35:23.066059 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xnvl6"] Oct 10 08:35:23 crc kubenswrapper[4732]: I1010 08:35:23.672064 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39795c3d-ce50-4e53-befa-12c4619a7e26" path="/var/lib/kubelet/pods/39795c3d-ce50-4e53-befa-12c4619a7e26/volumes" Oct 10 08:35:23 crc kubenswrapper[4732]: I1010 08:35:23.672569 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c98e014-24c6-4c11-9965-34dca9a8aa12" path="/var/lib/kubelet/pods/3c98e014-24c6-4c11-9965-34dca9a8aa12/volumes" Oct 10 08:35:23 crc kubenswrapper[4732]: I1010 08:35:23.673220 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5adc8e93-5e71-44c1-a74a-45498406543a" path="/var/lib/kubelet/pods/5adc8e93-5e71-44c1-a74a-45498406543a/volumes" Oct 10 08:35:25 crc kubenswrapper[4732]: I1010 08:35:25.356074 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:35:25 crc kubenswrapper[4732]: I1010 08:35:25.356166 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:35:32 crc kubenswrapper[4732]: I1010 08:35:32.054936 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-737a-account-create-z248t"] Oct 10 08:35:32 crc kubenswrapper[4732]: I1010 08:35:32.067646 4732 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-737a-account-create-z248t"] Oct 10 08:35:33 crc kubenswrapper[4732]: I1010 08:35:33.034138 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8455-account-create-m7fpc"] Oct 10 08:35:33 crc kubenswrapper[4732]: I1010 08:35:33.046131 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8ad3-account-create-frqct"] Oct 10 08:35:33 crc kubenswrapper[4732]: I1010 08:35:33.056282 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8455-account-create-m7fpc"] Oct 10 08:35:33 crc kubenswrapper[4732]: I1010 08:35:33.070955 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8ad3-account-create-frqct"] Oct 10 08:35:33 crc kubenswrapper[4732]: I1010 08:35:33.670747 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655f526b-2365-4ecd-b3d5-7d60beffedf1" path="/var/lib/kubelet/pods/655f526b-2365-4ecd-b3d5-7d60beffedf1/volumes" Oct 10 08:35:33 crc kubenswrapper[4732]: I1010 08:35:33.671348 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5fba53e-2bc4-4a82-982a-22f39e81a78f" path="/var/lib/kubelet/pods/d5fba53e-2bc4-4a82-982a-22f39e81a78f/volumes" Oct 10 08:35:33 crc kubenswrapper[4732]: I1010 08:35:33.671888 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d68ece14-1313-4f06-bce9-a46535685ad4" path="/var/lib/kubelet/pods/d68ece14-1313-4f06-bce9-a46535685ad4/volumes" Oct 10 08:35:36 crc kubenswrapper[4732]: I1010 08:35:36.220200 4732 scope.go:117] "RemoveContainer" containerID="b0031338a699ef23aa284fecd5b17da30969f81adc3bdfff92ccd5824a2155ad" Oct 10 08:35:36 crc kubenswrapper[4732]: I1010 08:35:36.253780 4732 scope.go:117] "RemoveContainer" containerID="65e339abb51245eb40d70c18c0e77adabb6cd759059a0fcf4bc70c3fdf875a34" Oct 10 08:35:36 crc kubenswrapper[4732]: I1010 08:35:36.298087 4732 scope.go:117] "RemoveContainer" 
containerID="51f41b6697f8b6a0ceb62dd647fa5eb7a5c062a7b355ccda36b137b07f784c75" Oct 10 08:35:36 crc kubenswrapper[4732]: I1010 08:35:36.343582 4732 scope.go:117] "RemoveContainer" containerID="6388b7d2c21e1bfad4dc15434200ea4a8cf75be618f410d97a0216d579f2b541" Oct 10 08:35:36 crc kubenswrapper[4732]: I1010 08:35:36.391828 4732 scope.go:117] "RemoveContainer" containerID="1326d8d9629b26dfca3f549d282a87e59e32f2e45fe3a13dc5788e695263aa74" Oct 10 08:35:36 crc kubenswrapper[4732]: I1010 08:35:36.446855 4732 scope.go:117] "RemoveContainer" containerID="1c673562f6bacf4507a50dfaa1eeabd5c3a2ac05cde1ce7d47a31dcdd9ee8391" Oct 10 08:35:51 crc kubenswrapper[4732]: I1010 08:35:51.050740 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7ww"] Oct 10 08:35:51 crc kubenswrapper[4732]: I1010 08:35:51.060011 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zb7ww"] Oct 10 08:35:51 crc kubenswrapper[4732]: I1010 08:35:51.674385 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac21c01-cdf9-4adf-a9ae-19219c311c33" path="/var/lib/kubelet/pods/cac21c01-cdf9-4adf-a9ae-19219c311c33/volumes" Oct 10 08:35:55 crc kubenswrapper[4732]: I1010 08:35:55.357481 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:35:55 crc kubenswrapper[4732]: I1010 08:35:55.357765 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:36:10 crc kubenswrapper[4732]: I1010 
08:36:10.054876 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xl7lk"] Oct 10 08:36:10 crc kubenswrapper[4732]: I1010 08:36:10.069297 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xl7lk"] Oct 10 08:36:11 crc kubenswrapper[4732]: I1010 08:36:11.032641 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mzxtv"] Oct 10 08:36:11 crc kubenswrapper[4732]: I1010 08:36:11.043840 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mzxtv"] Oct 10 08:36:11 crc kubenswrapper[4732]: I1010 08:36:11.676939 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50c57446-a9d6-4657-991f-7c4bd7cf0aa8" path="/var/lib/kubelet/pods/50c57446-a9d6-4657-991f-7c4bd7cf0aa8/volumes" Oct 10 08:36:11 crc kubenswrapper[4732]: I1010 08:36:11.680231 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6702bc05-24bf-45f4-96b2-994a19c2a40e" path="/var/lib/kubelet/pods/6702bc05-24bf-45f4-96b2-994a19c2a40e/volumes" Oct 10 08:36:25 crc kubenswrapper[4732]: I1010 08:36:25.355867 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:36:25 crc kubenswrapper[4732]: I1010 08:36:25.356855 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:36:25 crc kubenswrapper[4732]: I1010 08:36:25.356937 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:36:25 crc kubenswrapper[4732]: I1010 08:36:25.358186 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:36:25 crc kubenswrapper[4732]: I1010 08:36:25.358253 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" gracePeriod=600 Oct 10 08:36:25 crc kubenswrapper[4732]: E1010 08:36:25.495793 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:36:26 crc kubenswrapper[4732]: I1010 08:36:26.082957 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" exitCode=0 Oct 10 08:36:26 crc kubenswrapper[4732]: I1010 08:36:26.083002 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff"} 
Oct 10 08:36:26 crc kubenswrapper[4732]: I1010 08:36:26.083304 4732 scope.go:117] "RemoveContainer" containerID="bf642036695cfd106f68d43cb38c2eaa4d9337092959ec0ecee4949724bd8c41" Oct 10 08:36:26 crc kubenswrapper[4732]: I1010 08:36:26.084328 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:36:26 crc kubenswrapper[4732]: E1010 08:36:26.084843 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:36:36 crc kubenswrapper[4732]: I1010 08:36:36.622457 4732 scope.go:117] "RemoveContainer" containerID="b8dd24412467df82499f51adb57d84b91a04b84116b652042d87019d87b25cd6" Oct 10 08:36:36 crc kubenswrapper[4732]: I1010 08:36:36.667340 4732 scope.go:117] "RemoveContainer" containerID="c52c7a7628f1c740ba714e3f51d19a114b0b3e955dcd6c66e6e68b05dfa54f68" Oct 10 08:36:36 crc kubenswrapper[4732]: I1010 08:36:36.735434 4732 scope.go:117] "RemoveContainer" containerID="1dedbc4f9d946fb0f20248d0d35cbe013c515f1d2d08a68b8f07bde1c1d53841" Oct 10 08:36:41 crc kubenswrapper[4732]: I1010 08:36:41.660992 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:36:41 crc kubenswrapper[4732]: E1010 08:36:41.662152 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:36:53 crc kubenswrapper[4732]: I1010 08:36:53.675288 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:36:53 crc kubenswrapper[4732]: E1010 08:36:53.676303 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:36:57 crc kubenswrapper[4732]: I1010 08:36:57.064703 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jwcrg"] Oct 10 08:36:57 crc kubenswrapper[4732]: I1010 08:36:57.076257 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jwcrg"] Oct 10 08:36:57 crc kubenswrapper[4732]: I1010 08:36:57.694216 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f73311-ec18-40b6-8a6f-00c817d7f036" path="/var/lib/kubelet/pods/c5f73311-ec18-40b6-8a6f-00c817d7f036/volumes" Oct 10 08:37:06 crc kubenswrapper[4732]: I1010 08:37:06.660221 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:37:06 crc kubenswrapper[4732]: E1010 08:37:06.663085 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:37:17 crc kubenswrapper[4732]: I1010 08:37:17.661130 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:37:17 crc kubenswrapper[4732]: E1010 08:37:17.662292 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:37:31 crc kubenswrapper[4732]: I1010 08:37:31.660248 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:37:31 crc kubenswrapper[4732]: E1010 08:37:31.662517 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:37:36 crc kubenswrapper[4732]: I1010 08:37:36.844939 4732 scope.go:117] "RemoveContainer" containerID="11f5443b312edccbfab75fffb8bd3bf709de7f3b2f858549ed8e5d76bb956cb0" Oct 10 08:37:36 crc kubenswrapper[4732]: I1010 08:37:36.876982 4732 scope.go:117] "RemoveContainer" containerID="2bf76e2c58b875473328fc3477fa1c6125d874c2543e982e0892979fb9f7f668" Oct 10 08:37:36 crc kubenswrapper[4732]: I1010 08:37:36.934850 4732 scope.go:117] "RemoveContainer" containerID="a75bb423b7f2e4c9f9707efddc211ef545dbca4282beddf20c0d424987ba9c06" Oct 10 08:37:37 crc kubenswrapper[4732]: I1010 08:37:37.001175 4732 
scope.go:117] "RemoveContainer" containerID="9c015451911a5023fabf8fdc2342cdc71b6a2628d2de674a8a755d5db9a9f463" Oct 10 08:37:43 crc kubenswrapper[4732]: I1010 08:37:43.667183 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:37:43 crc kubenswrapper[4732]: E1010 08:37:43.667986 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:37:55 crc kubenswrapper[4732]: I1010 08:37:55.660874 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:37:55 crc kubenswrapper[4732]: E1010 08:37:55.662104 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:38:06 crc kubenswrapper[4732]: I1010 08:38:06.660443 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:38:06 crc kubenswrapper[4732]: E1010 08:38:06.661654 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:38:18 crc kubenswrapper[4732]: I1010 08:38:18.660390 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:38:18 crc kubenswrapper[4732]: E1010 08:38:18.661419 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:38:30 crc kubenswrapper[4732]: I1010 08:38:30.660569 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:38:30 crc kubenswrapper[4732]: E1010 08:38:30.661571 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:38:42 crc kubenswrapper[4732]: I1010 08:38:42.660124 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:38:42 crc kubenswrapper[4732]: E1010 08:38:42.660824 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:38:53 crc kubenswrapper[4732]: I1010 08:38:53.674079 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:38:53 crc kubenswrapper[4732]: E1010 08:38:53.675022 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:39:04 crc kubenswrapper[4732]: I1010 08:39:04.660140 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:39:04 crc kubenswrapper[4732]: E1010 08:39:04.661025 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:39:19 crc kubenswrapper[4732]: I1010 08:39:19.661172 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:39:19 crc kubenswrapper[4732]: E1010 08:39:19.662086 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:39:32 crc kubenswrapper[4732]: I1010 08:39:32.659951 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:39:32 crc kubenswrapper[4732]: E1010 08:39:32.660706 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:39:36 crc kubenswrapper[4732]: I1010 08:39:36.038731 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-wz54z"] Oct 10 08:39:36 crc kubenswrapper[4732]: I1010 08:39:36.047781 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-wz54z"] Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.020675 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4g9pr"] Oct 10 08:39:37 crc kubenswrapper[4732]: E1010 08:39:37.021844 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerName="registry-server" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.021887 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerName="registry-server" Oct 10 08:39:37 crc kubenswrapper[4732]: E1010 08:39:37.021921 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" 
containerName="extract-utilities" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.021939 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerName="extract-utilities" Oct 10 08:39:37 crc kubenswrapper[4732]: E1010 08:39:37.021975 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerName="extract-content" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.021995 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerName="extract-content" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.022479 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba635dcb-929b-4ea1-9a7a-495b36f06df2" containerName="registry-server" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.026115 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.034270 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4g9pr"] Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.098999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-utilities\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.099064 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-catalog-content\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 
10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.099279 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dml9\" (UniqueName: \"kubernetes.io/projected/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-kube-api-access-2dml9\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.201878 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-utilities\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.201930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-catalog-content\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.202045 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dml9\" (UniqueName: \"kubernetes.io/projected/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-kube-api-access-2dml9\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.202465 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-utilities\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc 
kubenswrapper[4732]: I1010 08:39:37.202474 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-catalog-content\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.220992 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dml9\" (UniqueName: \"kubernetes.io/projected/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-kube-api-access-2dml9\") pod \"redhat-operators-4g9pr\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.360862 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.675022 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87466246-6e08-4e6d-9dd6-50f57a188992" path="/var/lib/kubelet/pods/87466246-6e08-4e6d-9dd6-50f57a188992/volumes" Oct 10 08:39:37 crc kubenswrapper[4732]: I1010 08:39:37.818679 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4g9pr"] Oct 10 08:39:38 crc kubenswrapper[4732]: I1010 08:39:38.136263 4732 generic.go:334] "Generic (PLEG): container finished" podID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerID="f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f" exitCode=0 Oct 10 08:39:38 crc kubenswrapper[4732]: I1010 08:39:38.136312 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g9pr" event={"ID":"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd","Type":"ContainerDied","Data":"f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f"} Oct 10 08:39:38 crc kubenswrapper[4732]: I1010 
08:39:38.136341 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g9pr" event={"ID":"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd","Type":"ContainerStarted","Data":"b4f13730841c389e7994321795ccf465a461d201e38c16d481caf0ad1f975ef8"} Oct 10 08:39:38 crc kubenswrapper[4732]: I1010 08:39:38.138632 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:39:40 crc kubenswrapper[4732]: I1010 08:39:40.167509 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g9pr" event={"ID":"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd","Type":"ContainerStarted","Data":"72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946"} Oct 10 08:39:44 crc kubenswrapper[4732]: I1010 08:39:44.212012 4732 generic.go:334] "Generic (PLEG): container finished" podID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerID="72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946" exitCode=0 Oct 10 08:39:44 crc kubenswrapper[4732]: I1010 08:39:44.212112 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g9pr" event={"ID":"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd","Type":"ContainerDied","Data":"72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946"} Oct 10 08:39:45 crc kubenswrapper[4732]: I1010 08:39:45.660999 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:39:45 crc kubenswrapper[4732]: E1010 08:39:45.661734 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:39:46 crc kubenswrapper[4732]: I1010 08:39:46.039995 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-2708-account-create-rtsmq"] Oct 10 08:39:46 crc kubenswrapper[4732]: I1010 08:39:46.047670 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-2708-account-create-rtsmq"] Oct 10 08:39:46 crc kubenswrapper[4732]: I1010 08:39:46.235109 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g9pr" event={"ID":"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd","Type":"ContainerStarted","Data":"cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883"} Oct 10 08:39:46 crc kubenswrapper[4732]: I1010 08:39:46.260474 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4g9pr" podStartSLOduration=2.904559924 podStartE2EDuration="10.260444989s" podCreationTimestamp="2025-10-10 08:39:36 +0000 UTC" firstStartedPulling="2025-10-10 08:39:38.138295945 +0000 UTC m=+6505.207887186" lastFinishedPulling="2025-10-10 08:39:45.49418099 +0000 UTC m=+6512.563772251" observedRunningTime="2025-10-10 08:39:46.252454425 +0000 UTC m=+6513.322045706" watchObservedRunningTime="2025-10-10 08:39:46.260444989 +0000 UTC m=+6513.330036260" Oct 10 08:39:47 crc kubenswrapper[4732]: I1010 08:39:47.361256 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:47 crc kubenswrapper[4732]: I1010 08:39:47.361600 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:47 crc kubenswrapper[4732]: I1010 08:39:47.671478 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f199b1c8-bd78-4dda-b8ef-fe655bfedee1" path="/var/lib/kubelet/pods/f199b1c8-bd78-4dda-b8ef-fe655bfedee1/volumes" Oct 10 08:39:48 crc kubenswrapper[4732]: 
I1010 08:39:48.414301 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4g9pr" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="registry-server" probeResult="failure" output=< Oct 10 08:39:48 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 08:39:48 crc kubenswrapper[4732]: > Oct 10 08:39:57 crc kubenswrapper[4732]: I1010 08:39:57.415017 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:57 crc kubenswrapper[4732]: I1010 08:39:57.465834 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:57 crc kubenswrapper[4732]: I1010 08:39:57.684866 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4g9pr"] Oct 10 08:39:59 crc kubenswrapper[4732]: I1010 08:39:59.357839 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4g9pr" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="registry-server" containerID="cri-o://cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883" gracePeriod=2 Oct 10 08:39:59 crc kubenswrapper[4732]: I1010 08:39:59.660302 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:39:59 crc kubenswrapper[4732]: E1010 08:39:59.660875 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:39:59 crc 
kubenswrapper[4732]: I1010 08:39:59.856266 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:39:59 crc kubenswrapper[4732]: I1010 08:39:59.992037 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-utilities\") pod \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " Oct 10 08:39:59 crc kubenswrapper[4732]: I1010 08:39:59.992136 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-catalog-content\") pod \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " Oct 10 08:39:59 crc kubenswrapper[4732]: I1010 08:39:59.992219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dml9\" (UniqueName: \"kubernetes.io/projected/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-kube-api-access-2dml9\") pod \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\" (UID: \"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd\") " Oct 10 08:39:59 crc kubenswrapper[4732]: I1010 08:39:59.993675 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-utilities" (OuterVolumeSpecName: "utilities") pod "5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" (UID: "5ae3d1bb-4b16-4730-857c-2f00dd77bcdd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:39:59 crc kubenswrapper[4732]: I1010 08:39:59.997867 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-kube-api-access-2dml9" (OuterVolumeSpecName: "kube-api-access-2dml9") pod "5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" (UID: "5ae3d1bb-4b16-4730-857c-2f00dd77bcdd"). InnerVolumeSpecName "kube-api-access-2dml9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.083167 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" (UID: "5ae3d1bb-4b16-4730-857c-2f00dd77bcdd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.094902 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.094931 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.094942 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dml9\" (UniqueName: \"kubernetes.io/projected/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd-kube-api-access-2dml9\") on node \"crc\" DevicePath \"\"" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.373124 4732 generic.go:334] "Generic (PLEG): container finished" podID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" 
containerID="cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883" exitCode=0 Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.373192 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g9pr" event={"ID":"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd","Type":"ContainerDied","Data":"cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883"} Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.373217 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4g9pr" event={"ID":"5ae3d1bb-4b16-4730-857c-2f00dd77bcdd","Type":"ContainerDied","Data":"b4f13730841c389e7994321795ccf465a461d201e38c16d481caf0ad1f975ef8"} Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.373234 4732 scope.go:117] "RemoveContainer" containerID="cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.373238 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4g9pr" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.417808 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4g9pr"] Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.423274 4732 scope.go:117] "RemoveContainer" containerID="72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.427717 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4g9pr"] Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.445292 4732 scope.go:117] "RemoveContainer" containerID="f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.487725 4732 scope.go:117] "RemoveContainer" containerID="cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883" Oct 10 08:40:00 crc kubenswrapper[4732]: E1010 08:40:00.488188 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883\": container with ID starting with cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883 not found: ID does not exist" containerID="cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.488300 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883"} err="failed to get container status \"cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883\": rpc error: code = NotFound desc = could not find container \"cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883\": container with ID starting with cc9813a54c29826571f6aa9f5d1e56617d26d113c69c0112b698c8c828475883 not found: ID does 
not exist" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.488410 4732 scope.go:117] "RemoveContainer" containerID="72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946" Oct 10 08:40:00 crc kubenswrapper[4732]: E1010 08:40:00.488818 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946\": container with ID starting with 72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946 not found: ID does not exist" containerID="72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.488854 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946"} err="failed to get container status \"72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946\": rpc error: code = NotFound desc = could not find container \"72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946\": container with ID starting with 72f1c95febc16bee0b119d1c26de9c91f6a76e7dcb863f77b16f567cc20f4946 not found: ID does not exist" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.488872 4732 scope.go:117] "RemoveContainer" containerID="f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f" Oct 10 08:40:00 crc kubenswrapper[4732]: E1010 08:40:00.489109 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f\": container with ID starting with f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f not found: ID does not exist" containerID="f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f" Oct 10 08:40:00 crc kubenswrapper[4732]: I1010 08:40:00.489133 4732 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f"} err="failed to get container status \"f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f\": rpc error: code = NotFound desc = could not find container \"f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f\": container with ID starting with f10a8605ee920a26430ca9ac1fa31cda08603af9ba9f470eb7a0b205eb058a2f not found: ID does not exist" Oct 10 08:40:01 crc kubenswrapper[4732]: I1010 08:40:01.051371 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-9c2bk"] Oct 10 08:40:01 crc kubenswrapper[4732]: I1010 08:40:01.066944 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-9c2bk"] Oct 10 08:40:01 crc kubenswrapper[4732]: I1010 08:40:01.680169 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b0a935b-06d8-47e6-856f-d7f9b048d366" path="/var/lib/kubelet/pods/0b0a935b-06d8-47e6-856f-d7f9b048d366/volumes" Oct 10 08:40:01 crc kubenswrapper[4732]: I1010 08:40:01.680787 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" path="/var/lib/kubelet/pods/5ae3d1bb-4b16-4730-857c-2f00dd77bcdd/volumes" Oct 10 08:40:12 crc kubenswrapper[4732]: I1010 08:40:12.660283 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:40:12 crc kubenswrapper[4732]: E1010 08:40:12.660994 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:40:27 crc 
kubenswrapper[4732]: I1010 08:40:27.661108 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:40:27 crc kubenswrapper[4732]: E1010 08:40:27.662179 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:40:37 crc kubenswrapper[4732]: I1010 08:40:37.153308 4732 scope.go:117] "RemoveContainer" containerID="d606e88eff500763c03a15e62df988e8bc219a9846be9e921fbfa4be6ff6ea9e" Oct 10 08:40:37 crc kubenswrapper[4732]: I1010 08:40:37.185759 4732 scope.go:117] "RemoveContainer" containerID="a4a90fb91b9bd6cd3cdc7ace0289d3e8ee8d5240d1fece1347b6063f0953cf52" Oct 10 08:40:37 crc kubenswrapper[4732]: I1010 08:40:37.233176 4732 scope.go:117] "RemoveContainer" containerID="30b23039d67ae40203460f6ebf2c97d7959110e286631e999f57f3efc1a83019" Oct 10 08:40:37 crc kubenswrapper[4732]: I1010 08:40:37.261413 4732 scope.go:117] "RemoveContainer" containerID="cf19aa260e8267381679e3212c8ba4b531c4ed7c945c53b796baef22cdff1558" Oct 10 08:40:37 crc kubenswrapper[4732]: I1010 08:40:37.305537 4732 scope.go:117] "RemoveContainer" containerID="e866a582388f3bb611f09620f0ec81d3a46aabc433451b9110e2d48e73968703" Oct 10 08:40:40 crc kubenswrapper[4732]: I1010 08:40:40.660980 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:40:40 crc kubenswrapper[4732]: E1010 08:40:40.661937 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:40:53 crc kubenswrapper[4732]: I1010 08:40:53.671394 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:40:53 crc kubenswrapper[4732]: E1010 08:40:53.672470 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:41:07 crc kubenswrapper[4732]: I1010 08:41:07.661843 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:41:07 crc kubenswrapper[4732]: E1010 08:41:07.662807 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:41:07 crc kubenswrapper[4732]: I1010 08:41:07.988340 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-djpck"] Oct 10 08:41:07 crc kubenswrapper[4732]: E1010 08:41:07.989156 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="registry-server" Oct 10 08:41:07 crc kubenswrapper[4732]: I1010 
08:41:07.989181 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="registry-server" Oct 10 08:41:07 crc kubenswrapper[4732]: E1010 08:41:07.989207 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="extract-content" Oct 10 08:41:07 crc kubenswrapper[4732]: I1010 08:41:07.989217 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="extract-content" Oct 10 08:41:07 crc kubenswrapper[4732]: E1010 08:41:07.989249 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="extract-utilities" Oct 10 08:41:07 crc kubenswrapper[4732]: I1010 08:41:07.989258 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="extract-utilities" Oct 10 08:41:07 crc kubenswrapper[4732]: I1010 08:41:07.989487 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ae3d1bb-4b16-4730-857c-2f00dd77bcdd" containerName="registry-server" Oct 10 08:41:07 crc kubenswrapper[4732]: I1010 08:41:07.991328 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.002418 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-djpck"] Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.077647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-catalog-content\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.077736 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-utilities\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.078108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89g2r\" (UniqueName: \"kubernetes.io/projected/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-kube-api-access-89g2r\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.180419 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89g2r\" (UniqueName: \"kubernetes.io/projected/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-kube-api-access-89g2r\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.180600 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-catalog-content\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.180634 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-utilities\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.181192 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-catalog-content\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.181277 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-utilities\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.203572 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89g2r\" (UniqueName: \"kubernetes.io/projected/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-kube-api-access-89g2r\") pod \"redhat-marketplace-djpck\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.317514 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:08 crc kubenswrapper[4732]: I1010 08:41:08.839994 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-djpck"] Oct 10 08:41:08 crc kubenswrapper[4732]: W1010 08:41:08.848246 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a7ae8a3_21ef_4528_8bc3_418ff1fc363f.slice/crio-5f917dd66e7d7554e48a92e0dd54aa54ededaee8c81454fa7430e7212dbdd117 WatchSource:0}: Error finding container 5f917dd66e7d7554e48a92e0dd54aa54ededaee8c81454fa7430e7212dbdd117: Status 404 returned error can't find the container with id 5f917dd66e7d7554e48a92e0dd54aa54ededaee8c81454fa7430e7212dbdd117 Oct 10 08:41:09 crc kubenswrapper[4732]: I1010 08:41:09.134631 4732 generic.go:334] "Generic (PLEG): container finished" podID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerID="d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90" exitCode=0 Oct 10 08:41:09 crc kubenswrapper[4732]: I1010 08:41:09.134855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djpck" event={"ID":"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f","Type":"ContainerDied","Data":"d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90"} Oct 10 08:41:09 crc kubenswrapper[4732]: I1010 08:41:09.135026 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djpck" event={"ID":"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f","Type":"ContainerStarted","Data":"5f917dd66e7d7554e48a92e0dd54aa54ededaee8c81454fa7430e7212dbdd117"} Oct 10 08:41:10 crc kubenswrapper[4732]: I1010 08:41:10.144546 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djpck" 
event={"ID":"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f","Type":"ContainerStarted","Data":"895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1"} Oct 10 08:41:11 crc kubenswrapper[4732]: I1010 08:41:11.160418 4732 generic.go:334] "Generic (PLEG): container finished" podID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerID="895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1" exitCode=0 Oct 10 08:41:11 crc kubenswrapper[4732]: I1010 08:41:11.160511 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djpck" event={"ID":"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f","Type":"ContainerDied","Data":"895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1"} Oct 10 08:41:12 crc kubenswrapper[4732]: I1010 08:41:12.174567 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djpck" event={"ID":"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f","Type":"ContainerStarted","Data":"55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9"} Oct 10 08:41:12 crc kubenswrapper[4732]: I1010 08:41:12.203323 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-djpck" podStartSLOduration=2.62820856 podStartE2EDuration="5.203294464s" podCreationTimestamp="2025-10-10 08:41:07 +0000 UTC" firstStartedPulling="2025-10-10 08:41:09.136458942 +0000 UTC m=+6596.206050183" lastFinishedPulling="2025-10-10 08:41:11.711544846 +0000 UTC m=+6598.781136087" observedRunningTime="2025-10-10 08:41:12.192405102 +0000 UTC m=+6599.261996353" watchObservedRunningTime="2025-10-10 08:41:12.203294464 +0000 UTC m=+6599.272885745" Oct 10 08:41:18 crc kubenswrapper[4732]: I1010 08:41:18.318473 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:18 crc kubenswrapper[4732]: I1010 08:41:18.319038 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:18 crc kubenswrapper[4732]: I1010 08:41:18.388309 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:19 crc kubenswrapper[4732]: I1010 08:41:19.327160 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:19 crc kubenswrapper[4732]: I1010 08:41:19.391168 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-djpck"] Oct 10 08:41:19 crc kubenswrapper[4732]: I1010 08:41:19.661352 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:41:19 crc kubenswrapper[4732]: E1010 08:41:19.661898 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.260476 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-djpck" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="registry-server" containerID="cri-o://55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9" gracePeriod=2 Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.788241 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.866593 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89g2r\" (UniqueName: \"kubernetes.io/projected/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-kube-api-access-89g2r\") pod \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.866716 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-utilities\") pod \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.866857 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-catalog-content\") pod \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\" (UID: \"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f\") " Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.868098 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-utilities" (OuterVolumeSpecName: "utilities") pod "9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" (UID: "9a7ae8a3-21ef-4528-8bc3-418ff1fc363f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.871654 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-kube-api-access-89g2r" (OuterVolumeSpecName: "kube-api-access-89g2r") pod "9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" (UID: "9a7ae8a3-21ef-4528-8bc3-418ff1fc363f"). InnerVolumeSpecName "kube-api-access-89g2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.887312 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" (UID: "9a7ae8a3-21ef-4528-8bc3-418ff1fc363f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.969636 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89g2r\" (UniqueName: \"kubernetes.io/projected/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-kube-api-access-89g2r\") on node \"crc\" DevicePath \"\"" Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.969713 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:41:21 crc kubenswrapper[4732]: I1010 08:41:21.969727 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.270672 4732 generic.go:334] "Generic (PLEG): container finished" podID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerID="55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9" exitCode=0 Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.270776 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-djpck" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.270767 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djpck" event={"ID":"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f","Type":"ContainerDied","Data":"55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9"} Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.272054 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-djpck" event={"ID":"9a7ae8a3-21ef-4528-8bc3-418ff1fc363f","Type":"ContainerDied","Data":"5f917dd66e7d7554e48a92e0dd54aa54ededaee8c81454fa7430e7212dbdd117"} Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.272116 4732 scope.go:117] "RemoveContainer" containerID="55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.306230 4732 scope.go:117] "RemoveContainer" containerID="895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.330785 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-djpck"] Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.339775 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-djpck"] Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.350650 4732 scope.go:117] "RemoveContainer" containerID="d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.390264 4732 scope.go:117] "RemoveContainer" containerID="55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9" Oct 10 08:41:22 crc kubenswrapper[4732]: E1010 08:41:22.390682 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9\": container with ID starting with 55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9 not found: ID does not exist" containerID="55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.390735 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9"} err="failed to get container status \"55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9\": rpc error: code = NotFound desc = could not find container \"55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9\": container with ID starting with 55b32463d7dff195e4c8158704d36852a4da57202addc72265f5a652b05e7fc9 not found: ID does not exist" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.390760 4732 scope.go:117] "RemoveContainer" containerID="895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1" Oct 10 08:41:22 crc kubenswrapper[4732]: E1010 08:41:22.391243 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1\": container with ID starting with 895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1 not found: ID does not exist" containerID="895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.391284 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1"} err="failed to get container status \"895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1\": rpc error: code = NotFound desc = could not find container \"895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1\": container with ID 
starting with 895bd142c4f724c005902e22f965c23290858e3312fd6c0b4897ef01785b63f1 not found: ID does not exist" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.391325 4732 scope.go:117] "RemoveContainer" containerID="d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90" Oct 10 08:41:22 crc kubenswrapper[4732]: E1010 08:41:22.391731 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90\": container with ID starting with d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90 not found: ID does not exist" containerID="d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90" Oct 10 08:41:22 crc kubenswrapper[4732]: I1010 08:41:22.391779 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90"} err="failed to get container status \"d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90\": rpc error: code = NotFound desc = could not find container \"d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90\": container with ID starting with d5a075957b1f4c90263321e46d8cce7c9cc41a3027282b271609dc8a73178d90 not found: ID does not exist" Oct 10 08:41:23 crc kubenswrapper[4732]: I1010 08:41:23.671161 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" path="/var/lib/kubelet/pods/9a7ae8a3-21ef-4528-8bc3-418ff1fc363f/volumes" Oct 10 08:41:30 crc kubenswrapper[4732]: I1010 08:41:30.660205 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:41:31 crc kubenswrapper[4732]: I1010 08:41:31.370344 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"eadb6876f2039302d6f915ccf1dbcd45c2e884e21b68380c16a6038e8cba551e"} Oct 10 08:42:29 crc kubenswrapper[4732]: I1010 08:42:29.049335 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-628n7"] Oct 10 08:42:29 crc kubenswrapper[4732]: I1010 08:42:29.059751 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-628n7"] Oct 10 08:42:29 crc kubenswrapper[4732]: I1010 08:42:29.672718 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d03672-696b-4292-a40d-5b94573eed55" path="/var/lib/kubelet/pods/d0d03672-696b-4292-a40d-5b94573eed55/volumes" Oct 10 08:42:37 crc kubenswrapper[4732]: I1010 08:42:37.448183 4732 scope.go:117] "RemoveContainer" containerID="9def859f4adfd1bc44c46496404ccb26fa55c335b4d23459b98d70dff8cd6c26" Oct 10 08:42:39 crc kubenswrapper[4732]: I1010 08:42:39.057865 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-709f-account-create-gbtdf"] Oct 10 08:42:39 crc kubenswrapper[4732]: I1010 08:42:39.071108 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-709f-account-create-gbtdf"] Oct 10 08:42:39 crc kubenswrapper[4732]: I1010 08:42:39.675177 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00" path="/var/lib/kubelet/pods/69eb5ef8-a699-4c3b-b2be-b7eb9be7ac00/volumes" Oct 10 08:42:53 crc kubenswrapper[4732]: I1010 08:42:53.043460 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-nbjc4"] Oct 10 08:42:53 crc kubenswrapper[4732]: I1010 08:42:53.051721 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-nbjc4"] Oct 10 08:42:53 crc kubenswrapper[4732]: I1010 08:42:53.675057 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df0b5c8-3752-4897-aef3-903211609e38" 
path="/var/lib/kubelet/pods/5df0b5c8-3752-4897-aef3-903211609e38/volumes" Oct 10 08:43:37 crc kubenswrapper[4732]: I1010 08:43:37.543430 4732 scope.go:117] "RemoveContainer" containerID="cded0d67c97d744d406da3bfcf7e4c5699910effa01d42815242887c7f043bc7" Oct 10 08:43:37 crc kubenswrapper[4732]: I1010 08:43:37.591734 4732 scope.go:117] "RemoveContainer" containerID="b17ff3f177c72c8267d992f014a65debc50db84e79bfc5404cf3457faa207a63" Oct 10 08:43:55 crc kubenswrapper[4732]: I1010 08:43:55.356540 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:43:55 crc kubenswrapper[4732]: I1010 08:43:55.357294 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.303459 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vwsqh"] Oct 10 08:44:14 crc kubenswrapper[4732]: E1010 08:44:14.305954 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="registry-server" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.306060 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="registry-server" Oct 10 08:44:14 crc kubenswrapper[4732]: E1010 08:44:14.306146 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="extract-utilities" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 
08:44:14.306216 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="extract-utilities" Oct 10 08:44:14 crc kubenswrapper[4732]: E1010 08:44:14.306303 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="extract-content" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.306368 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="extract-content" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.306676 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7ae8a3-21ef-4528-8bc3-418ff1fc363f" containerName="registry-server" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.308783 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.312417 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwsqh"] Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.342170 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-utilities\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.342260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-catalog-content\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.342286 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrsrq\" (UniqueName: \"kubernetes.io/projected/1478428d-599f-41a1-a73a-8def56378ed9-kube-api-access-wrsrq\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.444178 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-utilities\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.444267 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-catalog-content\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.444291 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrsrq\" (UniqueName: \"kubernetes.io/projected/1478428d-599f-41a1-a73a-8def56378ed9-kube-api-access-wrsrq\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.445129 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-utilities\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.445378 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-catalog-content\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.464202 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrsrq\" (UniqueName: \"kubernetes.io/projected/1478428d-599f-41a1-a73a-8def56378ed9-kube-api-access-wrsrq\") pod \"community-operators-vwsqh\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:14 crc kubenswrapper[4732]: I1010 08:44:14.654356 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:15 crc kubenswrapper[4732]: I1010 08:44:15.165465 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vwsqh"] Oct 10 08:44:16 crc kubenswrapper[4732]: I1010 08:44:16.164172 4732 generic.go:334] "Generic (PLEG): container finished" podID="1478428d-599f-41a1-a73a-8def56378ed9" containerID="7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c" exitCode=0 Oct 10 08:44:16 crc kubenswrapper[4732]: I1010 08:44:16.164274 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwsqh" event={"ID":"1478428d-599f-41a1-a73a-8def56378ed9","Type":"ContainerDied","Data":"7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c"} Oct 10 08:44:16 crc kubenswrapper[4732]: I1010 08:44:16.164501 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwsqh" event={"ID":"1478428d-599f-41a1-a73a-8def56378ed9","Type":"ContainerStarted","Data":"94cff4066f68f690e63ae5b52cba562c23414b074f8f6f108db5adaec16b9023"} Oct 
10 08:44:18 crc kubenswrapper[4732]: I1010 08:44:18.181673 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwsqh" event={"ID":"1478428d-599f-41a1-a73a-8def56378ed9","Type":"ContainerStarted","Data":"8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01"} Oct 10 08:44:19 crc kubenswrapper[4732]: I1010 08:44:19.196494 4732 generic.go:334] "Generic (PLEG): container finished" podID="1478428d-599f-41a1-a73a-8def56378ed9" containerID="8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01" exitCode=0 Oct 10 08:44:19 crc kubenswrapper[4732]: I1010 08:44:19.196569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwsqh" event={"ID":"1478428d-599f-41a1-a73a-8def56378ed9","Type":"ContainerDied","Data":"8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01"} Oct 10 08:44:20 crc kubenswrapper[4732]: I1010 08:44:20.208750 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwsqh" event={"ID":"1478428d-599f-41a1-a73a-8def56378ed9","Type":"ContainerStarted","Data":"632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16"} Oct 10 08:44:20 crc kubenswrapper[4732]: I1010 08:44:20.243520 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vwsqh" podStartSLOduration=2.802714001 podStartE2EDuration="6.243498936s" podCreationTimestamp="2025-10-10 08:44:14 +0000 UTC" firstStartedPulling="2025-10-10 08:44:16.167380562 +0000 UTC m=+6783.236971833" lastFinishedPulling="2025-10-10 08:44:19.608165527 +0000 UTC m=+6786.677756768" observedRunningTime="2025-10-10 08:44:20.223610799 +0000 UTC m=+6787.293202070" watchObservedRunningTime="2025-10-10 08:44:20.243498936 +0000 UTC m=+6787.313090177" Oct 10 08:44:24 crc kubenswrapper[4732]: I1010 08:44:24.655185 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:24 crc kubenswrapper[4732]: I1010 08:44:24.655590 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:24 crc kubenswrapper[4732]: I1010 08:44:24.725092 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:25 crc kubenswrapper[4732]: I1010 08:44:25.313809 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:25 crc kubenswrapper[4732]: I1010 08:44:25.355958 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:44:25 crc kubenswrapper[4732]: I1010 08:44:25.356044 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:44:25 crc kubenswrapper[4732]: I1010 08:44:25.367786 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vwsqh"] Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.271710 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vwsqh" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="registry-server" containerID="cri-o://632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16" gracePeriod=2 Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.721992 4732 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.834474 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-catalog-content\") pod \"1478428d-599f-41a1-a73a-8def56378ed9\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.834636 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrsrq\" (UniqueName: \"kubernetes.io/projected/1478428d-599f-41a1-a73a-8def56378ed9-kube-api-access-wrsrq\") pod \"1478428d-599f-41a1-a73a-8def56378ed9\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.834682 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-utilities\") pod \"1478428d-599f-41a1-a73a-8def56378ed9\" (UID: \"1478428d-599f-41a1-a73a-8def56378ed9\") " Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.836062 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-utilities" (OuterVolumeSpecName: "utilities") pod "1478428d-599f-41a1-a73a-8def56378ed9" (UID: "1478428d-599f-41a1-a73a-8def56378ed9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.842015 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1478428d-599f-41a1-a73a-8def56378ed9-kube-api-access-wrsrq" (OuterVolumeSpecName: "kube-api-access-wrsrq") pod "1478428d-599f-41a1-a73a-8def56378ed9" (UID: "1478428d-599f-41a1-a73a-8def56378ed9"). 
InnerVolumeSpecName "kube-api-access-wrsrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.901291 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1478428d-599f-41a1-a73a-8def56378ed9" (UID: "1478428d-599f-41a1-a73a-8def56378ed9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.936982 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.937037 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrsrq\" (UniqueName: \"kubernetes.io/projected/1478428d-599f-41a1-a73a-8def56378ed9-kube-api-access-wrsrq\") on node \"crc\" DevicePath \"\"" Oct 10 08:44:27 crc kubenswrapper[4732]: I1010 08:44:27.937049 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1478428d-599f-41a1-a73a-8def56378ed9-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.283463 4732 generic.go:334] "Generic (PLEG): container finished" podID="1478428d-599f-41a1-a73a-8def56378ed9" containerID="632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16" exitCode=0 Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.283501 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwsqh" event={"ID":"1478428d-599f-41a1-a73a-8def56378ed9","Type":"ContainerDied","Data":"632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16"} Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.283762 
4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vwsqh" event={"ID":"1478428d-599f-41a1-a73a-8def56378ed9","Type":"ContainerDied","Data":"94cff4066f68f690e63ae5b52cba562c23414b074f8f6f108db5adaec16b9023"} Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.283603 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vwsqh" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.283779 4732 scope.go:117] "RemoveContainer" containerID="632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.305893 4732 scope.go:117] "RemoveContainer" containerID="8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.317556 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vwsqh"] Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.328793 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vwsqh"] Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.349781 4732 scope.go:117] "RemoveContainer" containerID="7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.392034 4732 scope.go:117] "RemoveContainer" containerID="632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16" Oct 10 08:44:28 crc kubenswrapper[4732]: E1010 08:44:28.393725 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16\": container with ID starting with 632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16 not found: ID does not exist" containerID="632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16" Oct 10 08:44:28 
crc kubenswrapper[4732]: I1010 08:44:28.393775 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16"} err="failed to get container status \"632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16\": rpc error: code = NotFound desc = could not find container \"632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16\": container with ID starting with 632be52c2785e7aa647d28d2fe596ca16e0f7f9d7e151481cb38df3df0e27e16 not found: ID does not exist" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.393806 4732 scope.go:117] "RemoveContainer" containerID="8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01" Oct 10 08:44:28 crc kubenswrapper[4732]: E1010 08:44:28.394170 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01\": container with ID starting with 8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01 not found: ID does not exist" containerID="8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.394191 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01"} err="failed to get container status \"8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01\": rpc error: code = NotFound desc = could not find container \"8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01\": container with ID starting with 8c0625aa6005655c2d3ee241f6f1551609f8591c2abf6acb2a87414fdbe80a01 not found: ID does not exist" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.394210 4732 scope.go:117] "RemoveContainer" containerID="7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c" Oct 10 
08:44:28 crc kubenswrapper[4732]: E1010 08:44:28.394477 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c\": container with ID starting with 7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c not found: ID does not exist" containerID="7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c" Oct 10 08:44:28 crc kubenswrapper[4732]: I1010 08:44:28.394509 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c"} err="failed to get container status \"7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c\": rpc error: code = NotFound desc = could not find container \"7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c\": container with ID starting with 7d7eb98d91ba7ea50410ebb20f72ca262ad8135e813d80ac8aebed5d7a06251c not found: ID does not exist" Oct 10 08:44:29 crc kubenswrapper[4732]: I1010 08:44:29.675636 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1478428d-599f-41a1-a73a-8def56378ed9" path="/var/lib/kubelet/pods/1478428d-599f-41a1-a73a-8def56378ed9/volumes" Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.356720 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.357370 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.357423 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.359034 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eadb6876f2039302d6f915ccf1dbcd45c2e884e21b68380c16a6038e8cba551e"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.359109 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://eadb6876f2039302d6f915ccf1dbcd45c2e884e21b68380c16a6038e8cba551e" gracePeriod=600 Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.565034 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="eadb6876f2039302d6f915ccf1dbcd45c2e884e21b68380c16a6038e8cba551e" exitCode=0 Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.565109 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"eadb6876f2039302d6f915ccf1dbcd45c2e884e21b68380c16a6038e8cba551e"} Oct 10 08:44:55 crc kubenswrapper[4732]: I1010 08:44:55.565421 4732 scope.go:117] "RemoveContainer" containerID="1778eef4d2117250d08706234ffe26e7e11a33fbf78ba5ed46edc461cc8dacff" Oct 10 08:44:56 crc kubenswrapper[4732]: I1010 08:44:56.578911 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b"} Oct 10 08:44:58 crc kubenswrapper[4732]: I1010 08:44:58.969067 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5jx66"] Oct 10 08:44:58 crc kubenswrapper[4732]: E1010 08:44:58.970361 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="registry-server" Oct 10 08:44:58 crc kubenswrapper[4732]: I1010 08:44:58.970384 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="registry-server" Oct 10 08:44:58 crc kubenswrapper[4732]: E1010 08:44:58.970429 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="extract-content" Oct 10 08:44:58 crc kubenswrapper[4732]: I1010 08:44:58.970441 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="extract-content" Oct 10 08:44:58 crc kubenswrapper[4732]: E1010 08:44:58.970471 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="extract-utilities" Oct 10 08:44:58 crc kubenswrapper[4732]: I1010 08:44:58.970484 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="extract-utilities" Oct 10 08:44:58 crc kubenswrapper[4732]: I1010 08:44:58.970885 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1478428d-599f-41a1-a73a-8def56378ed9" containerName="registry-server" Oct 10 08:44:58 crc kubenswrapper[4732]: I1010 08:44:58.973540 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:58 crc kubenswrapper[4732]: I1010 08:44:58.983149 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jx66"] Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.039309 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-utilities\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.039358 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfltb\" (UniqueName: \"kubernetes.io/projected/5063edf7-26ec-42cb-957a-344df05c7b17-kube-api-access-dfltb\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.039397 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-catalog-content\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.141281 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-utilities\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.141344 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dfltb\" (UniqueName: \"kubernetes.io/projected/5063edf7-26ec-42cb-957a-344df05c7b17-kube-api-access-dfltb\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.141392 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-catalog-content\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.141828 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-utilities\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.141885 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-catalog-content\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.160999 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfltb\" (UniqueName: \"kubernetes.io/projected/5063edf7-26ec-42cb-957a-344df05c7b17-kube-api-access-dfltb\") pod \"certified-operators-5jx66\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.293463 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:44:59 crc kubenswrapper[4732]: I1010 08:44:59.854374 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jx66"] Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.170521 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj"] Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.172316 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.175017 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.177223 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.188162 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj"] Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.271786 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62841533-3fab-4b0b-a51f-a9afff879bba-config-volume\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.272093 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62841533-3fab-4b0b-a51f-a9afff879bba-secret-volume\") pod 
\"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.272177 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8wh\" (UniqueName: \"kubernetes.io/projected/62841533-3fab-4b0b-a51f-a9afff879bba-kube-api-access-vx8wh\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.374799 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62841533-3fab-4b0b-a51f-a9afff879bba-secret-volume\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.374908 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8wh\" (UniqueName: \"kubernetes.io/projected/62841533-3fab-4b0b-a51f-a9afff879bba-kube-api-access-vx8wh\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.375028 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62841533-3fab-4b0b-a51f-a9afff879bba-config-volume\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.375813 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62841533-3fab-4b0b-a51f-a9afff879bba-config-volume\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.381081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62841533-3fab-4b0b-a51f-a9afff879bba-secret-volume\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.392206 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8wh\" (UniqueName: \"kubernetes.io/projected/62841533-3fab-4b0b-a51f-a9afff879bba-kube-api-access-vx8wh\") pod \"collect-profiles-29334765-pbjwj\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.495679 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.633549 4732 generic.go:334] "Generic (PLEG): container finished" podID="5063edf7-26ec-42cb-957a-344df05c7b17" containerID="ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0" exitCode=0 Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.633715 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jx66" event={"ID":"5063edf7-26ec-42cb-957a-344df05c7b17","Type":"ContainerDied","Data":"ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0"} Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.633741 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jx66" event={"ID":"5063edf7-26ec-42cb-957a-344df05c7b17","Type":"ContainerStarted","Data":"92455f5164b59a15eb43c312732c97374e823895414acddd0f211c92966eb8ae"} Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.637664 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:45:00 crc kubenswrapper[4732]: I1010 08:45:00.948397 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj"] Oct 10 08:45:00 crc kubenswrapper[4732]: W1010 08:45:00.954408 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62841533_3fab_4b0b_a51f_a9afff879bba.slice/crio-1fdfa464dafe3c0e3dba01df091685ff6bbcbfa82bcfb31c52e854750a6c245a WatchSource:0}: Error finding container 1fdfa464dafe3c0e3dba01df091685ff6bbcbfa82bcfb31c52e854750a6c245a: Status 404 returned error can't find the container with id 1fdfa464dafe3c0e3dba01df091685ff6bbcbfa82bcfb31c52e854750a6c245a Oct 10 08:45:01 crc kubenswrapper[4732]: I1010 08:45:01.644997 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" event={"ID":"62841533-3fab-4b0b-a51f-a9afff879bba","Type":"ContainerStarted","Data":"c89d6d547c7d69e0326ec4681c27ad5ee35f8741744e5688593dbd9022c799b1"} Oct 10 08:45:01 crc kubenswrapper[4732]: I1010 08:45:01.645076 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" event={"ID":"62841533-3fab-4b0b-a51f-a9afff879bba","Type":"ContainerStarted","Data":"1fdfa464dafe3c0e3dba01df091685ff6bbcbfa82bcfb31c52e854750a6c245a"} Oct 10 08:45:01 crc kubenswrapper[4732]: I1010 08:45:01.666324 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" podStartSLOduration=1.666308221 podStartE2EDuration="1.666308221s" podCreationTimestamp="2025-10-10 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 08:45:01.659896658 +0000 UTC m=+6828.729487899" watchObservedRunningTime="2025-10-10 08:45:01.666308221 +0000 UTC m=+6828.735899462" Oct 10 08:45:02 crc kubenswrapper[4732]: I1010 08:45:02.655336 4732 generic.go:334] "Generic (PLEG): container finished" podID="62841533-3fab-4b0b-a51f-a9afff879bba" containerID="c89d6d547c7d69e0326ec4681c27ad5ee35f8741744e5688593dbd9022c799b1" exitCode=0 Oct 10 08:45:02 crc kubenswrapper[4732]: I1010 08:45:02.655414 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" event={"ID":"62841533-3fab-4b0b-a51f-a9afff879bba","Type":"ContainerDied","Data":"c89d6d547c7d69e0326ec4681c27ad5ee35f8741744e5688593dbd9022c799b1"} Oct 10 08:45:02 crc kubenswrapper[4732]: I1010 08:45:02.657653 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jx66" 
event={"ID":"5063edf7-26ec-42cb-957a-344df05c7b17","Type":"ContainerStarted","Data":"802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962"} Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.314201 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.357574 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62841533-3fab-4b0b-a51f-a9afff879bba-config-volume\") pod \"62841533-3fab-4b0b-a51f-a9afff879bba\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.357716 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62841533-3fab-4b0b-a51f-a9afff879bba-secret-volume\") pod \"62841533-3fab-4b0b-a51f-a9afff879bba\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.357837 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx8wh\" (UniqueName: \"kubernetes.io/projected/62841533-3fab-4b0b-a51f-a9afff879bba-kube-api-access-vx8wh\") pod \"62841533-3fab-4b0b-a51f-a9afff879bba\" (UID: \"62841533-3fab-4b0b-a51f-a9afff879bba\") " Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.358433 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62841533-3fab-4b0b-a51f-a9afff879bba-config-volume" (OuterVolumeSpecName: "config-volume") pod "62841533-3fab-4b0b-a51f-a9afff879bba" (UID: "62841533-3fab-4b0b-a51f-a9afff879bba"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.358724 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62841533-3fab-4b0b-a51f-a9afff879bba-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.365122 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62841533-3fab-4b0b-a51f-a9afff879bba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "62841533-3fab-4b0b-a51f-a9afff879bba" (UID: "62841533-3fab-4b0b-a51f-a9afff879bba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.365986 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62841533-3fab-4b0b-a51f-a9afff879bba-kube-api-access-vx8wh" (OuterVolumeSpecName: "kube-api-access-vx8wh") pod "62841533-3fab-4b0b-a51f-a9afff879bba" (UID: "62841533-3fab-4b0b-a51f-a9afff879bba"). InnerVolumeSpecName "kube-api-access-vx8wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.460302 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62841533-3fab-4b0b-a51f-a9afff879bba-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.460355 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx8wh\" (UniqueName: \"kubernetes.io/projected/62841533-3fab-4b0b-a51f-a9afff879bba-kube-api-access-vx8wh\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.677377 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" event={"ID":"62841533-3fab-4b0b-a51f-a9afff879bba","Type":"ContainerDied","Data":"1fdfa464dafe3c0e3dba01df091685ff6bbcbfa82bcfb31c52e854750a6c245a"} Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.677411 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.677416 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fdfa464dafe3c0e3dba01df091685ff6bbcbfa82bcfb31c52e854750a6c245a" Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.679438 4732 generic.go:334] "Generic (PLEG): container finished" podID="5063edf7-26ec-42cb-957a-344df05c7b17" containerID="802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962" exitCode=0 Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.679483 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jx66" event={"ID":"5063edf7-26ec-42cb-957a-344df05c7b17","Type":"ContainerDied","Data":"802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962"} Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.750645 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx"] Oct 10 08:45:04 crc kubenswrapper[4732]: I1010 08:45:04.758883 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334720-k2tdx"] Oct 10 08:45:05 crc kubenswrapper[4732]: I1010 08:45:05.674682 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03988de-67cd-458f-98e5-8913168773f4" path="/var/lib/kubelet/pods/d03988de-67cd-458f-98e5-8913168773f4/volumes" Oct 10 08:45:05 crc kubenswrapper[4732]: I1010 08:45:05.694237 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jx66" event={"ID":"5063edf7-26ec-42cb-957a-344df05c7b17","Type":"ContainerStarted","Data":"82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4"} Oct 10 08:45:05 crc kubenswrapper[4732]: I1010 08:45:05.696408 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" containerID="118559cb653174b800879919597672796acf9fcac4227e239057ccd9bb10f369" exitCode=0 Oct 10 08:45:05 crc kubenswrapper[4732]: I1010 08:45:05.696464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" event={"ID":"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19","Type":"ContainerDied","Data":"118559cb653174b800879919597672796acf9fcac4227e239057ccd9bb10f369"} Oct 10 08:45:05 crc kubenswrapper[4732]: I1010 08:45:05.714258 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5jx66" podStartSLOduration=3.174524007 podStartE2EDuration="7.714242824s" podCreationTimestamp="2025-10-10 08:44:58 +0000 UTC" firstStartedPulling="2025-10-10 08:45:00.635664632 +0000 UTC m=+6827.705255873" lastFinishedPulling="2025-10-10 08:45:05.175383419 +0000 UTC m=+6832.244974690" observedRunningTime="2025-10-10 08:45:05.713080073 +0000 UTC m=+6832.782671324" watchObservedRunningTime="2025-10-10 08:45:05.714242824 +0000 UTC m=+6832.783834065" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.140941 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.220075 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-inventory\") pod \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.220214 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f24fk\" (UniqueName: \"kubernetes.io/projected/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-kube-api-access-f24fk\") pod \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.220430 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-ssh-key\") pod \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.220487 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-tripleo-cleanup-combined-ca-bundle\") pod \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\" (UID: \"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19\") " Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.226125 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-kube-api-access-f24fk" (OuterVolumeSpecName: "kube-api-access-f24fk") pod "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" (UID: "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19"). InnerVolumeSpecName "kube-api-access-f24fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.226429 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" (UID: "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.265867 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" (UID: "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.269384 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-inventory" (OuterVolumeSpecName: "inventory") pod "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" (UID: "2aeb77fe-3b78-467f-a4c1-c383d1ad4d19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.322959 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.323211 4732 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.323329 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.323410 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f24fk\" (UniqueName: \"kubernetes.io/projected/2aeb77fe-3b78-467f-a4c1-c383d1ad4d19-kube-api-access-f24fk\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.721742 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.721767 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq" event={"ID":"2aeb77fe-3b78-467f-a4c1-c383d1ad4d19","Type":"ContainerDied","Data":"2b9c7ab4fcd5911c75072fda4430a5b3f9da03e1fb960f84767821a6744afbca"} Oct 10 08:45:07 crc kubenswrapper[4732]: I1010 08:45:07.721894 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b9c7ab4fcd5911c75072fda4430a5b3f9da03e1fb960f84767821a6744afbca" Oct 10 08:45:09 crc kubenswrapper[4732]: I1010 08:45:09.293811 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:45:09 crc kubenswrapper[4732]: I1010 08:45:09.294091 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:45:09 crc kubenswrapper[4732]: I1010 08:45:09.347205 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.030289 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-qw2mn"] Oct 10 08:45:16 crc kubenswrapper[4732]: E1010 08:45:16.031909 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.031980 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 10 08:45:16 crc kubenswrapper[4732]: E1010 08:45:16.032048 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="62841533-3fab-4b0b-a51f-a9afff879bba" containerName="collect-profiles" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.032189 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="62841533-3fab-4b0b-a51f-a9afff879bba" containerName="collect-profiles" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.032475 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="62841533-3fab-4b0b-a51f-a9afff879bba" containerName="collect-profiles" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.032542 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aeb77fe-3b78-467f-a4c1-c383d1ad4d19" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.033353 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.039153 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.040194 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.040453 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.041726 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.048253 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-qw2mn"] Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.118070 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-inventory\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.118165 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.118246 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v49nc\" (UniqueName: \"kubernetes.io/projected/48a3c76e-8709-4c95-b087-4a5d83083c97-kube-api-access-v49nc\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.118317 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.219552 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v49nc\" (UniqueName: \"kubernetes.io/projected/48a3c76e-8709-4c95-b087-4a5d83083c97-kube-api-access-v49nc\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 
crc kubenswrapper[4732]: I1010 08:45:16.219884 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.220086 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-inventory\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.220839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.226018 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.226173 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.227358 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-inventory\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.239459 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v49nc\" (UniqueName: \"kubernetes.io/projected/48a3c76e-8709-4c95-b087-4a5d83083c97-kube-api-access-v49nc\") pod \"bootstrap-openstack-openstack-cell1-qw2mn\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.364687 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:45:16 crc kubenswrapper[4732]: I1010 08:45:16.908861 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-qw2mn"] Oct 10 08:45:16 crc kubenswrapper[4732]: W1010 08:45:16.915373 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48a3c76e_8709_4c95_b087_4a5d83083c97.slice/crio-dfaf4fbab134635f2393b138fdc206df8f3e3a737c406c9356b4b876a301a208 WatchSource:0}: Error finding container dfaf4fbab134635f2393b138fdc206df8f3e3a737c406c9356b4b876a301a208: Status 404 returned error can't find the container with id dfaf4fbab134635f2393b138fdc206df8f3e3a737c406c9356b4b876a301a208 Oct 10 08:45:17 crc kubenswrapper[4732]: I1010 08:45:17.822344 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" 
event={"ID":"48a3c76e-8709-4c95-b087-4a5d83083c97","Type":"ContainerStarted","Data":"77c8402f6c37fb0e74f703dc800162c69dc5e94dc15e349c9cb99d4794090b36"} Oct 10 08:45:17 crc kubenswrapper[4732]: I1010 08:45:17.822821 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" event={"ID":"48a3c76e-8709-4c95-b087-4a5d83083c97","Type":"ContainerStarted","Data":"dfaf4fbab134635f2393b138fdc206df8f3e3a737c406c9356b4b876a301a208"} Oct 10 08:45:17 crc kubenswrapper[4732]: I1010 08:45:17.845550 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" podStartSLOduration=1.350551874 podStartE2EDuration="1.845530995s" podCreationTimestamp="2025-10-10 08:45:16 +0000 UTC" firstStartedPulling="2025-10-10 08:45:16.918037651 +0000 UTC m=+6843.987628892" lastFinishedPulling="2025-10-10 08:45:17.413016772 +0000 UTC m=+6844.482608013" observedRunningTime="2025-10-10 08:45:17.84015063 +0000 UTC m=+6844.909741871" watchObservedRunningTime="2025-10-10 08:45:17.845530995 +0000 UTC m=+6844.915122246" Oct 10 08:45:19 crc kubenswrapper[4732]: I1010 08:45:19.355640 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:45:19 crc kubenswrapper[4732]: I1010 08:45:19.410957 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jx66"] Oct 10 08:45:19 crc kubenswrapper[4732]: I1010 08:45:19.855957 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5jx66" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="registry-server" containerID="cri-o://82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4" gracePeriod=2 Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.313295 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.408355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-utilities\") pod \"5063edf7-26ec-42cb-957a-344df05c7b17\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.408963 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfltb\" (UniqueName: \"kubernetes.io/projected/5063edf7-26ec-42cb-957a-344df05c7b17-kube-api-access-dfltb\") pod \"5063edf7-26ec-42cb-957a-344df05c7b17\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.409193 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-catalog-content\") pod \"5063edf7-26ec-42cb-957a-344df05c7b17\" (UID: \"5063edf7-26ec-42cb-957a-344df05c7b17\") " Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.409387 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-utilities" (OuterVolumeSpecName: "utilities") pod "5063edf7-26ec-42cb-957a-344df05c7b17" (UID: "5063edf7-26ec-42cb-957a-344df05c7b17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.409820 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.414048 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5063edf7-26ec-42cb-957a-344df05c7b17-kube-api-access-dfltb" (OuterVolumeSpecName: "kube-api-access-dfltb") pod "5063edf7-26ec-42cb-957a-344df05c7b17" (UID: "5063edf7-26ec-42cb-957a-344df05c7b17"). InnerVolumeSpecName "kube-api-access-dfltb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.453867 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5063edf7-26ec-42cb-957a-344df05c7b17" (UID: "5063edf7-26ec-42cb-957a-344df05c7b17"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.511139 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfltb\" (UniqueName: \"kubernetes.io/projected/5063edf7-26ec-42cb-957a-344df05c7b17-kube-api-access-dfltb\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.511203 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5063edf7-26ec-42cb-957a-344df05c7b17-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.869025 4732 generic.go:334] "Generic (PLEG): container finished" podID="5063edf7-26ec-42cb-957a-344df05c7b17" containerID="82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4" exitCode=0 Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.869076 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jx66" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.869102 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jx66" event={"ID":"5063edf7-26ec-42cb-957a-344df05c7b17","Type":"ContainerDied","Data":"82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4"} Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.869424 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jx66" event={"ID":"5063edf7-26ec-42cb-957a-344df05c7b17","Type":"ContainerDied","Data":"92455f5164b59a15eb43c312732c97374e823895414acddd0f211c92966eb8ae"} Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.869491 4732 scope.go:117] "RemoveContainer" containerID="82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.890936 4732 scope.go:117] "RemoveContainer" 
containerID="802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.905449 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jx66"] Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.912958 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5jx66"] Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.949023 4732 scope.go:117] "RemoveContainer" containerID="ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.982110 4732 scope.go:117] "RemoveContainer" containerID="82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4" Oct 10 08:45:20 crc kubenswrapper[4732]: E1010 08:45:20.982640 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4\": container with ID starting with 82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4 not found: ID does not exist" containerID="82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.982678 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4"} err="failed to get container status \"82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4\": rpc error: code = NotFound desc = could not find container \"82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4\": container with ID starting with 82660e47baff2ffeda7f60706a832f1560e08a55b21e3a2f398c5863d391c5b4 not found: ID does not exist" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.982721 4732 scope.go:117] "RemoveContainer" 
containerID="802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962" Oct 10 08:45:20 crc kubenswrapper[4732]: E1010 08:45:20.983160 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962\": container with ID starting with 802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962 not found: ID does not exist" containerID="802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.983331 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962"} err="failed to get container status \"802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962\": rpc error: code = NotFound desc = could not find container \"802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962\": container with ID starting with 802374c6cfb448ae4492711e149224200724425cd55b3e5dc7c30653339eb962 not found: ID does not exist" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.983478 4732 scope.go:117] "RemoveContainer" containerID="ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0" Oct 10 08:45:20 crc kubenswrapper[4732]: E1010 08:45:20.983989 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0\": container with ID starting with ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0 not found: ID does not exist" containerID="ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0" Oct 10 08:45:20 crc kubenswrapper[4732]: I1010 08:45:20.984023 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0"} err="failed to get container status \"ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0\": rpc error: code = NotFound desc = could not find container \"ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0\": container with ID starting with ba8c026d2c947f94995cc99cee610d0a8e855785aea2771a87f89fe1f1111da0 not found: ID does not exist" Oct 10 08:45:21 crc kubenswrapper[4732]: I1010 08:45:21.672458 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" path="/var/lib/kubelet/pods/5063edf7-26ec-42cb-957a-344df05c7b17/volumes" Oct 10 08:45:37 crc kubenswrapper[4732]: I1010 08:45:37.733326 4732 scope.go:117] "RemoveContainer" containerID="238c2ff99494f459911c656fb1f271494b04aa49f3b0917b158c46854a3ac05b" Oct 10 08:46:55 crc kubenswrapper[4732]: I1010 08:46:55.356605 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:46:55 crc kubenswrapper[4732]: I1010 08:46:55.357230 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:47:25 crc kubenswrapper[4732]: I1010 08:47:25.355877 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 
08:47:25 crc kubenswrapper[4732]: I1010 08:47:25.356683 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:47:55 crc kubenswrapper[4732]: I1010 08:47:55.356270 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:47:55 crc kubenswrapper[4732]: I1010 08:47:55.356843 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:47:55 crc kubenswrapper[4732]: I1010 08:47:55.356896 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:47:55 crc kubenswrapper[4732]: I1010 08:47:55.357719 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:47:55 crc kubenswrapper[4732]: I1010 08:47:55.357805 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" gracePeriod=600 Oct 10 08:47:55 crc kubenswrapper[4732]: E1010 08:47:55.577138 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:47:56 crc kubenswrapper[4732]: I1010 08:47:56.420279 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" exitCode=0 Oct 10 08:47:56 crc kubenswrapper[4732]: I1010 08:47:56.420376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b"} Oct 10 08:47:56 crc kubenswrapper[4732]: I1010 08:47:56.420737 4732 scope.go:117] "RemoveContainer" containerID="eadb6876f2039302d6f915ccf1dbcd45c2e884e21b68380c16a6038e8cba551e" Oct 10 08:47:56 crc kubenswrapper[4732]: I1010 08:47:56.421893 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:47:56 crc kubenswrapper[4732]: E1010 08:47:56.422533 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:48:09 crc kubenswrapper[4732]: I1010 08:48:09.661125 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:48:09 crc kubenswrapper[4732]: E1010 08:48:09.662245 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:48:22 crc kubenswrapper[4732]: I1010 08:48:22.660283 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:48:22 crc kubenswrapper[4732]: E1010 08:48:22.661058 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:48:23 crc kubenswrapper[4732]: I1010 08:48:23.717289 4732 generic.go:334] "Generic (PLEG): container finished" podID="48a3c76e-8709-4c95-b087-4a5d83083c97" containerID="77c8402f6c37fb0e74f703dc800162c69dc5e94dc15e349c9cb99d4794090b36" exitCode=0 Oct 10 08:48:23 crc kubenswrapper[4732]: I1010 08:48:23.717359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" 
event={"ID":"48a3c76e-8709-4c95-b087-4a5d83083c97","Type":"ContainerDied","Data":"77c8402f6c37fb0e74f703dc800162c69dc5e94dc15e349c9cb99d4794090b36"} Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.221202 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.279638 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-bootstrap-combined-ca-bundle\") pod \"48a3c76e-8709-4c95-b087-4a5d83083c97\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.279806 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v49nc\" (UniqueName: \"kubernetes.io/projected/48a3c76e-8709-4c95-b087-4a5d83083c97-kube-api-access-v49nc\") pod \"48a3c76e-8709-4c95-b087-4a5d83083c97\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.279930 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-ssh-key\") pod \"48a3c76e-8709-4c95-b087-4a5d83083c97\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.279981 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-inventory\") pod \"48a3c76e-8709-4c95-b087-4a5d83083c97\" (UID: \"48a3c76e-8709-4c95-b087-4a5d83083c97\") " Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.303219 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/48a3c76e-8709-4c95-b087-4a5d83083c97-kube-api-access-v49nc" (OuterVolumeSpecName: "kube-api-access-v49nc") pod "48a3c76e-8709-4c95-b087-4a5d83083c97" (UID: "48a3c76e-8709-4c95-b087-4a5d83083c97"). InnerVolumeSpecName "kube-api-access-v49nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.311400 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "48a3c76e-8709-4c95-b087-4a5d83083c97" (UID: "48a3c76e-8709-4c95-b087-4a5d83083c97"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.324884 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-inventory" (OuterVolumeSpecName: "inventory") pod "48a3c76e-8709-4c95-b087-4a5d83083c97" (UID: "48a3c76e-8709-4c95-b087-4a5d83083c97"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.332077 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "48a3c76e-8709-4c95-b087-4a5d83083c97" (UID: "48a3c76e-8709-4c95-b087-4a5d83083c97"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.391891 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.391977 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.392005 4732 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48a3c76e-8709-4c95-b087-4a5d83083c97-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.392027 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v49nc\" (UniqueName: \"kubernetes.io/projected/48a3c76e-8709-4c95-b087-4a5d83083c97-kube-api-access-v49nc\") on node \"crc\" DevicePath \"\"" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.735643 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" event={"ID":"48a3c76e-8709-4c95-b087-4a5d83083c97","Type":"ContainerDied","Data":"dfaf4fbab134635f2393b138fdc206df8f3e3a737c406c9356b4b876a301a208"} Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.735721 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfaf4fbab134635f2393b138fdc206df8f3e3a737c406c9356b4b876a301a208" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.736211 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-qw2mn" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.845421 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-26bh4"] Oct 10 08:48:25 crc kubenswrapper[4732]: E1010 08:48:25.846092 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="extract-utilities" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.846114 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="extract-utilities" Oct 10 08:48:25 crc kubenswrapper[4732]: E1010 08:48:25.846132 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="registry-server" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.846139 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="registry-server" Oct 10 08:48:25 crc kubenswrapper[4732]: E1010 08:48:25.846165 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="extract-content" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.846171 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="extract-content" Oct 10 08:48:25 crc kubenswrapper[4732]: E1010 08:48:25.846177 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a3c76e-8709-4c95-b087-4a5d83083c97" containerName="bootstrap-openstack-openstack-cell1" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.846183 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a3c76e-8709-4c95-b087-4a5d83083c97" containerName="bootstrap-openstack-openstack-cell1" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.846360 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5063edf7-26ec-42cb-957a-344df05c7b17" containerName="registry-server" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.846375 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a3c76e-8709-4c95-b087-4a5d83083c97" containerName="bootstrap-openstack-openstack-cell1" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.847106 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.855341 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.855378 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.855352 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.855563 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.861082 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-26bh4"] Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.903740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbn7h\" (UniqueName: \"kubernetes.io/projected/6a758950-7ac7-433f-8d14-1a39efae029b-kube-api-access-dbn7h\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.904748 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:25 crc kubenswrapper[4732]: I1010 08:48:25.904805 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-inventory\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.007378 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-inventory\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.007435 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.007593 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbn7h\" (UniqueName: \"kubernetes.io/projected/6a758950-7ac7-433f-8d14-1a39efae029b-kube-api-access-dbn7h\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " 
pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.012077 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-ssh-key\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.012088 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-inventory\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.027968 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbn7h\" (UniqueName: \"kubernetes.io/projected/6a758950-7ac7-433f-8d14-1a39efae029b-kube-api-access-dbn7h\") pod \"download-cache-openstack-openstack-cell1-26bh4\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.176856 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:48:26 crc kubenswrapper[4732]: I1010 08:48:26.764323 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-26bh4"] Oct 10 08:48:27 crc kubenswrapper[4732]: I1010 08:48:27.763870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" event={"ID":"6a758950-7ac7-433f-8d14-1a39efae029b","Type":"ContainerStarted","Data":"c94818f8104808ac88b8d127ce9bc72fc5008f71965c49699572ac59fe2a8db7"} Oct 10 08:48:28 crc kubenswrapper[4732]: I1010 08:48:28.772749 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" event={"ID":"6a758950-7ac7-433f-8d14-1a39efae029b","Type":"ContainerStarted","Data":"ff23e7e4795f69efc540b7e9cb944c67fff982a74e8aaa43b58dd8ad2670e580"} Oct 10 08:48:28 crc kubenswrapper[4732]: I1010 08:48:28.796724 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" podStartSLOduration=3.019597153 podStartE2EDuration="3.796705258s" podCreationTimestamp="2025-10-10 08:48:25 +0000 UTC" firstStartedPulling="2025-10-10 08:48:26.774257868 +0000 UTC m=+7033.843849109" lastFinishedPulling="2025-10-10 08:48:27.551365973 +0000 UTC m=+7034.620957214" observedRunningTime="2025-10-10 08:48:28.792642928 +0000 UTC m=+7035.862234169" watchObservedRunningTime="2025-10-10 08:48:28.796705258 +0000 UTC m=+7035.866296499" Oct 10 08:48:34 crc kubenswrapper[4732]: I1010 08:48:34.660153 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:48:34 crc kubenswrapper[4732]: E1010 08:48:34.660995 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:48:48 crc kubenswrapper[4732]: I1010 08:48:48.661131 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:48:48 crc kubenswrapper[4732]: E1010 08:48:48.662148 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:49:01 crc kubenswrapper[4732]: I1010 08:49:01.660482 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:49:01 crc kubenswrapper[4732]: E1010 08:49:01.661361 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:49:16 crc kubenswrapper[4732]: I1010 08:49:16.661410 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:49:16 crc kubenswrapper[4732]: E1010 08:49:16.662260 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:49:31 crc kubenswrapper[4732]: I1010 08:49:31.659737 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:49:31 crc kubenswrapper[4732]: E1010 08:49:31.660475 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:49:45 crc kubenswrapper[4732]: I1010 08:49:45.661210 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:49:45 crc kubenswrapper[4732]: E1010 08:49:45.662249 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:49:56 crc kubenswrapper[4732]: I1010 08:49:56.660796 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:49:56 crc kubenswrapper[4732]: E1010 08:49:56.661759 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:50:07 crc kubenswrapper[4732]: I1010 08:50:07.816455 4732 generic.go:334] "Generic (PLEG): container finished" podID="6a758950-7ac7-433f-8d14-1a39efae029b" containerID="ff23e7e4795f69efc540b7e9cb944c67fff982a74e8aaa43b58dd8ad2670e580" exitCode=0 Oct 10 08:50:07 crc kubenswrapper[4732]: I1010 08:50:07.816510 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" event={"ID":"6a758950-7ac7-433f-8d14-1a39efae029b","Type":"ContainerDied","Data":"ff23e7e4795f69efc540b7e9cb944c67fff982a74e8aaa43b58dd8ad2670e580"} Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.370487 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.401011 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-inventory\") pod \"6a758950-7ac7-433f-8d14-1a39efae029b\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.401128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbn7h\" (UniqueName: \"kubernetes.io/projected/6a758950-7ac7-433f-8d14-1a39efae029b-kube-api-access-dbn7h\") pod \"6a758950-7ac7-433f-8d14-1a39efae029b\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.401200 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-ssh-key\") pod \"6a758950-7ac7-433f-8d14-1a39efae029b\" (UID: \"6a758950-7ac7-433f-8d14-1a39efae029b\") " Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.415645 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a758950-7ac7-433f-8d14-1a39efae029b-kube-api-access-dbn7h" (OuterVolumeSpecName: "kube-api-access-dbn7h") pod "6a758950-7ac7-433f-8d14-1a39efae029b" (UID: "6a758950-7ac7-433f-8d14-1a39efae029b"). InnerVolumeSpecName "kube-api-access-dbn7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.428802 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-inventory" (OuterVolumeSpecName: "inventory") pod "6a758950-7ac7-433f-8d14-1a39efae029b" (UID: "6a758950-7ac7-433f-8d14-1a39efae029b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.451682 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6a758950-7ac7-433f-8d14-1a39efae029b" (UID: "6a758950-7ac7-433f-8d14-1a39efae029b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.503400 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.503452 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbn7h\" (UniqueName: \"kubernetes.io/projected/6a758950-7ac7-433f-8d14-1a39efae029b-kube-api-access-dbn7h\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.503465 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6a758950-7ac7-433f-8d14-1a39efae029b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.842932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" event={"ID":"6a758950-7ac7-433f-8d14-1a39efae029b","Type":"ContainerDied","Data":"c94818f8104808ac88b8d127ce9bc72fc5008f71965c49699572ac59fe2a8db7"} Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.843283 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c94818f8104808ac88b8d127ce9bc72fc5008f71965c49699572ac59fe2a8db7" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.843031 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-26bh4" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.924734 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-845gd"] Oct 10 08:50:09 crc kubenswrapper[4732]: E1010 08:50:09.925114 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a758950-7ac7-433f-8d14-1a39efae029b" containerName="download-cache-openstack-openstack-cell1" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.925131 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a758950-7ac7-433f-8d14-1a39efae029b" containerName="download-cache-openstack-openstack-cell1" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.925302 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a758950-7ac7-433f-8d14-1a39efae029b" containerName="download-cache-openstack-openstack-cell1" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.926019 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.928607 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.928793 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.928842 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.929033 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:50:09 crc kubenswrapper[4732]: I1010 08:50:09.955003 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-845gd"] Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.016322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-ssh-key\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.016605 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-inventory\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.016759 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-jb47b\" (UniqueName: \"kubernetes.io/projected/f58d836d-542b-46f8-8446-2ab8874fb834-kube-api-access-jb47b\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.119409 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-inventory\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.119507 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb47b\" (UniqueName: \"kubernetes.io/projected/f58d836d-542b-46f8-8446-2ab8874fb834-kube-api-access-jb47b\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.119598 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-ssh-key\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.124399 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-inventory\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 
10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.125380 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-ssh-key\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.150292 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb47b\" (UniqueName: \"kubernetes.io/projected/f58d836d-542b-46f8-8446-2ab8874fb834-kube-api-access-jb47b\") pod \"configure-network-openstack-openstack-cell1-845gd\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.262077 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.824417 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-845gd"] Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.840572 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:50:10 crc kubenswrapper[4732]: I1010 08:50:10.858202 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-845gd" event={"ID":"f58d836d-542b-46f8-8446-2ab8874fb834","Type":"ContainerStarted","Data":"3947d395a15465e1ad720ab79a2c1f9be49855b2b2784eb803afdba389296159"} Oct 10 08:50:11 crc kubenswrapper[4732]: I1010 08:50:11.660025 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:50:11 crc kubenswrapper[4732]: E1010 08:50:11.660506 4732 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:50:11 crc kubenswrapper[4732]: I1010 08:50:11.867505 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-845gd" event={"ID":"f58d836d-542b-46f8-8446-2ab8874fb834","Type":"ContainerStarted","Data":"f09d0a6762b88f0762184ea1914715f0ecc126cb3164b703726ed89a4b1799b0"} Oct 10 08:50:11 crc kubenswrapper[4732]: I1010 08:50:11.886618 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-845gd" podStartSLOduration=2.345431458 podStartE2EDuration="2.886598925s" podCreationTimestamp="2025-10-10 08:50:09 +0000 UTC" firstStartedPulling="2025-10-10 08:50:10.840000015 +0000 UTC m=+7137.909591296" lastFinishedPulling="2025-10-10 08:50:11.381167512 +0000 UTC m=+7138.450758763" observedRunningTime="2025-10-10 08:50:11.881329642 +0000 UTC m=+7138.950920883" watchObservedRunningTime="2025-10-10 08:50:11.886598925 +0000 UTC m=+7138.956190186" Oct 10 08:50:22 crc kubenswrapper[4732]: I1010 08:50:22.660907 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:50:22 crc kubenswrapper[4732]: E1010 08:50:22.662102 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:50:37 crc kubenswrapper[4732]: I1010 08:50:37.660248 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:50:37 crc kubenswrapper[4732]: E1010 08:50:37.660924 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:50:49 crc kubenswrapper[4732]: I1010 08:50:49.662549 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:50:49 crc kubenswrapper[4732]: E1010 08:50:49.663813 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:51:02 crc kubenswrapper[4732]: I1010 08:51:02.660544 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:51:02 crc kubenswrapper[4732]: E1010 08:51:02.661337 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:51:17 crc kubenswrapper[4732]: I1010 08:51:17.660311 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:51:17 crc kubenswrapper[4732]: E1010 08:51:17.661235 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:51:29 crc kubenswrapper[4732]: I1010 08:51:29.661163 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:51:29 crc kubenswrapper[4732]: E1010 08:51:29.662110 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:51:30 crc kubenswrapper[4732]: I1010 08:51:30.766470 4732 generic.go:334] "Generic (PLEG): container finished" podID="f58d836d-542b-46f8-8446-2ab8874fb834" containerID="f09d0a6762b88f0762184ea1914715f0ecc126cb3164b703726ed89a4b1799b0" exitCode=0 Oct 10 08:51:30 crc kubenswrapper[4732]: I1010 08:51:30.766528 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-network-openstack-openstack-cell1-845gd" event={"ID":"f58d836d-542b-46f8-8446-2ab8874fb834","Type":"ContainerDied","Data":"f09d0a6762b88f0762184ea1914715f0ecc126cb3164b703726ed89a4b1799b0"} Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.270446 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.361485 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-inventory\") pod \"f58d836d-542b-46f8-8446-2ab8874fb834\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.361586 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb47b\" (UniqueName: \"kubernetes.io/projected/f58d836d-542b-46f8-8446-2ab8874fb834-kube-api-access-jb47b\") pod \"f58d836d-542b-46f8-8446-2ab8874fb834\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.361671 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-ssh-key\") pod \"f58d836d-542b-46f8-8446-2ab8874fb834\" (UID: \"f58d836d-542b-46f8-8446-2ab8874fb834\") " Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.368294 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58d836d-542b-46f8-8446-2ab8874fb834-kube-api-access-jb47b" (OuterVolumeSpecName: "kube-api-access-jb47b") pod "f58d836d-542b-46f8-8446-2ab8874fb834" (UID: "f58d836d-542b-46f8-8446-2ab8874fb834"). InnerVolumeSpecName "kube-api-access-jb47b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.400860 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-inventory" (OuterVolumeSpecName: "inventory") pod "f58d836d-542b-46f8-8446-2ab8874fb834" (UID: "f58d836d-542b-46f8-8446-2ab8874fb834"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.419667 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f58d836d-542b-46f8-8446-2ab8874fb834" (UID: "f58d836d-542b-46f8-8446-2ab8874fb834"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.464735 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.464786 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb47b\" (UniqueName: \"kubernetes.io/projected/f58d836d-542b-46f8-8446-2ab8874fb834-kube-api-access-jb47b\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.464807 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f58d836d-542b-46f8-8446-2ab8874fb834-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.795480 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-845gd" 
event={"ID":"f58d836d-542b-46f8-8446-2ab8874fb834","Type":"ContainerDied","Data":"3947d395a15465e1ad720ab79a2c1f9be49855b2b2784eb803afdba389296159"} Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.795882 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3947d395a15465e1ad720ab79a2c1f9be49855b2b2784eb803afdba389296159" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.795575 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-845gd" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.946463 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-8qfmp"] Oct 10 08:51:32 crc kubenswrapper[4732]: E1010 08:51:32.946850 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58d836d-542b-46f8-8446-2ab8874fb834" containerName="configure-network-openstack-openstack-cell1" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.946867 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58d836d-542b-46f8-8446-2ab8874fb834" containerName="configure-network-openstack-openstack-cell1" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.947028 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58d836d-542b-46f8-8446-2ab8874fb834" containerName="configure-network-openstack-openstack-cell1" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.947842 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.953796 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.953796 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.955182 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.962379 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-8qfmp"] Oct 10 08:51:32 crc kubenswrapper[4732]: I1010 08:51:32.964902 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.077140 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwgqp\" (UniqueName: \"kubernetes.io/projected/deae8388-32f8-45a9-959e-a0b6e758d873-kube-api-access-jwgqp\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.077201 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-ssh-key\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.077360 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-inventory\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.179607 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwgqp\" (UniqueName: \"kubernetes.io/projected/deae8388-32f8-45a9-959e-a0b6e758d873-kube-api-access-jwgqp\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.179756 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-ssh-key\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.179878 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-inventory\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.185231 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-ssh-key\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " 
pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.192503 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-inventory\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.198641 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwgqp\" (UniqueName: \"kubernetes.io/projected/deae8388-32f8-45a9-959e-a0b6e758d873-kube-api-access-jwgqp\") pod \"validate-network-openstack-openstack-cell1-8qfmp\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.270290 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:33 crc kubenswrapper[4732]: I1010 08:51:33.829576 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-8qfmp"] Oct 10 08:51:34 crc kubenswrapper[4732]: I1010 08:51:34.831947 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" event={"ID":"deae8388-32f8-45a9-959e-a0b6e758d873","Type":"ContainerStarted","Data":"cee8ce54385c6df1a7d5a9f6ab210918056bef89130a6e1122b6777dca6ce4ed"} Oct 10 08:51:34 crc kubenswrapper[4732]: I1010 08:51:34.831988 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" event={"ID":"deae8388-32f8-45a9-959e-a0b6e758d873","Type":"ContainerStarted","Data":"580cb74ebcf76535930c3d5b28bfa78151cc226dc880884e4d27cff86b40f680"} Oct 10 08:51:34 crc kubenswrapper[4732]: I1010 08:51:34.854785 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" podStartSLOduration=2.182200386 podStartE2EDuration="2.85475913s" podCreationTimestamp="2025-10-10 08:51:32 +0000 UTC" firstStartedPulling="2025-10-10 08:51:33.823788942 +0000 UTC m=+7220.893380183" lastFinishedPulling="2025-10-10 08:51:34.496347686 +0000 UTC m=+7221.565938927" observedRunningTime="2025-10-10 08:51:34.846175998 +0000 UTC m=+7221.915767259" watchObservedRunningTime="2025-10-10 08:51:34.85475913 +0000 UTC m=+7221.924350381" Oct 10 08:51:40 crc kubenswrapper[4732]: I1010 08:51:40.659928 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:51:40 crc kubenswrapper[4732]: E1010 08:51:40.660556 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:51:40 crc kubenswrapper[4732]: I1010 08:51:40.899189 4732 generic.go:334] "Generic (PLEG): container finished" podID="deae8388-32f8-45a9-959e-a0b6e758d873" containerID="cee8ce54385c6df1a7d5a9f6ab210918056bef89130a6e1122b6777dca6ce4ed" exitCode=0 Oct 10 08:51:40 crc kubenswrapper[4732]: I1010 08:51:40.899255 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" event={"ID":"deae8388-32f8-45a9-959e-a0b6e758d873","Type":"ContainerDied","Data":"cee8ce54385c6df1a7d5a9f6ab210918056bef89130a6e1122b6777dca6ce4ed"} Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.346913 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.400077 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-inventory\") pod \"deae8388-32f8-45a9-959e-a0b6e758d873\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.400176 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-ssh-key\") pod \"deae8388-32f8-45a9-959e-a0b6e758d873\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.400270 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwgqp\" (UniqueName: 
\"kubernetes.io/projected/deae8388-32f8-45a9-959e-a0b6e758d873-kube-api-access-jwgqp\") pod \"deae8388-32f8-45a9-959e-a0b6e758d873\" (UID: \"deae8388-32f8-45a9-959e-a0b6e758d873\") " Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.408494 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deae8388-32f8-45a9-959e-a0b6e758d873-kube-api-access-jwgqp" (OuterVolumeSpecName: "kube-api-access-jwgqp") pod "deae8388-32f8-45a9-959e-a0b6e758d873" (UID: "deae8388-32f8-45a9-959e-a0b6e758d873"). InnerVolumeSpecName "kube-api-access-jwgqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.438553 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-inventory" (OuterVolumeSpecName: "inventory") pod "deae8388-32f8-45a9-959e-a0b6e758d873" (UID: "deae8388-32f8-45a9-959e-a0b6e758d873"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.440897 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "deae8388-32f8-45a9-959e-a0b6e758d873" (UID: "deae8388-32f8-45a9-959e-a0b6e758d873"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.502269 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwgqp\" (UniqueName: \"kubernetes.io/projected/deae8388-32f8-45a9-959e-a0b6e758d873-kube-api-access-jwgqp\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.502307 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.502320 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/deae8388-32f8-45a9-959e-a0b6e758d873-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.918613 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" event={"ID":"deae8388-32f8-45a9-959e-a0b6e758d873","Type":"ContainerDied","Data":"580cb74ebcf76535930c3d5b28bfa78151cc226dc880884e4d27cff86b40f680"} Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.918661 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-8qfmp" Oct 10 08:51:42 crc kubenswrapper[4732]: I1010 08:51:42.918667 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="580cb74ebcf76535930c3d5b28bfa78151cc226dc880884e4d27cff86b40f680" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.096990 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-thp4h"] Oct 10 08:51:43 crc kubenswrapper[4732]: E1010 08:51:43.097482 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deae8388-32f8-45a9-959e-a0b6e758d873" containerName="validate-network-openstack-openstack-cell1" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.097504 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="deae8388-32f8-45a9-959e-a0b6e758d873" containerName="validate-network-openstack-openstack-cell1" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.097776 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="deae8388-32f8-45a9-959e-a0b6e758d873" containerName="validate-network-openstack-openstack-cell1" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.098640 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.102002 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.106093 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.106423 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.125074 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.134178 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-thp4h"] Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.216354 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk29z\" (UniqueName: \"kubernetes.io/projected/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-kube-api-access-bk29z\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.216452 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-inventory\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.216532 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-ssh-key\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.318001 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-ssh-key\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.318334 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk29z\" (UniqueName: \"kubernetes.io/projected/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-kube-api-access-bk29z\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.318396 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-inventory\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.323328 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-ssh-key\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.324292 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-inventory\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.337434 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk29z\" (UniqueName: \"kubernetes.io/projected/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-kube-api-access-bk29z\") pod \"install-os-openstack-openstack-cell1-thp4h\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:43 crc kubenswrapper[4732]: I1010 08:51:43.482484 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:51:44 crc kubenswrapper[4732]: I1010 08:51:44.081547 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-thp4h"] Oct 10 08:51:44 crc kubenswrapper[4732]: W1010 08:51:44.088174 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a2e5b09_abde_4b6e_925b_b2b2ff1a1346.slice/crio-bb91d84fec3a91df61b5a2b3804a16962bc21dd10c4299105aaf7df5f1f3cf76 WatchSource:0}: Error finding container bb91d84fec3a91df61b5a2b3804a16962bc21dd10c4299105aaf7df5f1f3cf76: Status 404 returned error can't find the container with id bb91d84fec3a91df61b5a2b3804a16962bc21dd10c4299105aaf7df5f1f3cf76 Oct 10 08:51:44 crc kubenswrapper[4732]: I1010 08:51:44.940453 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-thp4h" event={"ID":"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346","Type":"ContainerStarted","Data":"e15f24d06f8d92473cb8fd29feedfe18cb685e9f29696531de69604d85ae16e5"} Oct 10 08:51:44 crc kubenswrapper[4732]: I1010 
08:51:44.940749 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-thp4h" event={"ID":"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346","Type":"ContainerStarted","Data":"bb91d84fec3a91df61b5a2b3804a16962bc21dd10c4299105aaf7df5f1f3cf76"} Oct 10 08:51:44 crc kubenswrapper[4732]: I1010 08:51:44.976216 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-thp4h" podStartSLOduration=1.463824802 podStartE2EDuration="1.976191161s" podCreationTimestamp="2025-10-10 08:51:43 +0000 UTC" firstStartedPulling="2025-10-10 08:51:44.090742601 +0000 UTC m=+7231.160333842" lastFinishedPulling="2025-10-10 08:51:44.60310896 +0000 UTC m=+7231.672700201" observedRunningTime="2025-10-10 08:51:44.965744579 +0000 UTC m=+7232.035335840" watchObservedRunningTime="2025-10-10 08:51:44.976191161 +0000 UTC m=+7232.045782442" Oct 10 08:51:52 crc kubenswrapper[4732]: I1010 08:51:52.660208 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:51:52 crc kubenswrapper[4732]: E1010 08:51:52.660950 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:52:03 crc kubenswrapper[4732]: I1010 08:52:03.679159 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:52:03 crc kubenswrapper[4732]: E1010 08:52:03.681261 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.555868 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xqn22"] Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.559543 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.581184 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqn22"] Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.619057 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-catalog-content\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.619163 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65gz9\" (UniqueName: \"kubernetes.io/projected/584b3c70-7fc2-4ff2-b234-5f4000604e2a-kube-api-access-65gz9\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.619510 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-utilities\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " 
pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.722148 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-utilities\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.722298 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-catalog-content\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.722413 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65gz9\" (UniqueName: \"kubernetes.io/projected/584b3c70-7fc2-4ff2-b234-5f4000604e2a-kube-api-access-65gz9\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.722645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-utilities\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.722938 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-catalog-content\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" 
Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.745529 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65gz9\" (UniqueName: \"kubernetes.io/projected/584b3c70-7fc2-4ff2-b234-5f4000604e2a-kube-api-access-65gz9\") pod \"redhat-marketplace-xqn22\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:08 crc kubenswrapper[4732]: I1010 08:52:08.916614 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:09 crc kubenswrapper[4732]: I1010 08:52:09.381010 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqn22"] Oct 10 08:52:10 crc kubenswrapper[4732]: I1010 08:52:10.251371 4732 generic.go:334] "Generic (PLEG): container finished" podID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerID="3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af" exitCode=0 Oct 10 08:52:10 crc kubenswrapper[4732]: I1010 08:52:10.251439 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqn22" event={"ID":"584b3c70-7fc2-4ff2-b234-5f4000604e2a","Type":"ContainerDied","Data":"3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af"} Oct 10 08:52:10 crc kubenswrapper[4732]: I1010 08:52:10.251759 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqn22" event={"ID":"584b3c70-7fc2-4ff2-b234-5f4000604e2a","Type":"ContainerStarted","Data":"c735995d69ff201ae990ae97873a3fb2b5abb5a9b4d52cc52bbff6857c560fd2"} Oct 10 08:52:12 crc kubenswrapper[4732]: I1010 08:52:12.280541 4732 generic.go:334] "Generic (PLEG): container finished" podID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerID="c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2" exitCode=0 Oct 10 08:52:12 crc kubenswrapper[4732]: I1010 08:52:12.280729 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqn22" event={"ID":"584b3c70-7fc2-4ff2-b234-5f4000604e2a","Type":"ContainerDied","Data":"c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2"} Oct 10 08:52:13 crc kubenswrapper[4732]: I1010 08:52:13.294533 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqn22" event={"ID":"584b3c70-7fc2-4ff2-b234-5f4000604e2a","Type":"ContainerStarted","Data":"3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741"} Oct 10 08:52:13 crc kubenswrapper[4732]: I1010 08:52:13.325755 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xqn22" podStartSLOduration=2.632265127 podStartE2EDuration="5.32573231s" podCreationTimestamp="2025-10-10 08:52:08 +0000 UTC" firstStartedPulling="2025-10-10 08:52:10.256791642 +0000 UTC m=+7257.326382883" lastFinishedPulling="2025-10-10 08:52:12.950258815 +0000 UTC m=+7260.019850066" observedRunningTime="2025-10-10 08:52:13.315881894 +0000 UTC m=+7260.385473155" watchObservedRunningTime="2025-10-10 08:52:13.32573231 +0000 UTC m=+7260.395323561" Oct 10 08:52:14 crc kubenswrapper[4732]: I1010 08:52:14.662280 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:52:14 crc kubenswrapper[4732]: E1010 08:52:14.666465 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:52:18 crc kubenswrapper[4732]: I1010 08:52:18.917833 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:18 crc kubenswrapper[4732]: I1010 08:52:18.918661 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:19 crc kubenswrapper[4732]: I1010 08:52:19.016393 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:19 crc kubenswrapper[4732]: I1010 08:52:19.450412 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:19 crc kubenswrapper[4732]: I1010 08:52:19.500023 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqn22"] Oct 10 08:52:21 crc kubenswrapper[4732]: I1010 08:52:21.391362 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xqn22" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="registry-server" containerID="cri-o://3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741" gracePeriod=2 Oct 10 08:52:21 crc kubenswrapper[4732]: I1010 08:52:21.936506 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.038306 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-catalog-content\") pod \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.038447 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-utilities\") pod \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.038669 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65gz9\" (UniqueName: \"kubernetes.io/projected/584b3c70-7fc2-4ff2-b234-5f4000604e2a-kube-api-access-65gz9\") pod \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\" (UID: \"584b3c70-7fc2-4ff2-b234-5f4000604e2a\") " Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.039317 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-utilities" (OuterVolumeSpecName: "utilities") pod "584b3c70-7fc2-4ff2-b234-5f4000604e2a" (UID: "584b3c70-7fc2-4ff2-b234-5f4000604e2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.048065 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584b3c70-7fc2-4ff2-b234-5f4000604e2a-kube-api-access-65gz9" (OuterVolumeSpecName: "kube-api-access-65gz9") pod "584b3c70-7fc2-4ff2-b234-5f4000604e2a" (UID: "584b3c70-7fc2-4ff2-b234-5f4000604e2a"). InnerVolumeSpecName "kube-api-access-65gz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.051293 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "584b3c70-7fc2-4ff2-b234-5f4000604e2a" (UID: "584b3c70-7fc2-4ff2-b234-5f4000604e2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.141916 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.141979 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584b3c70-7fc2-4ff2-b234-5f4000604e2a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.142004 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65gz9\" (UniqueName: \"kubernetes.io/projected/584b3c70-7fc2-4ff2-b234-5f4000604e2a-kube-api-access-65gz9\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.407408 4732 generic.go:334] "Generic (PLEG): container finished" podID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerID="3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741" exitCode=0 Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.407473 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xqn22" event={"ID":"584b3c70-7fc2-4ff2-b234-5f4000604e2a","Type":"ContainerDied","Data":"3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741"} Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.407516 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-xqn22" event={"ID":"584b3c70-7fc2-4ff2-b234-5f4000604e2a","Type":"ContainerDied","Data":"c735995d69ff201ae990ae97873a3fb2b5abb5a9b4d52cc52bbff6857c560fd2"} Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.407533 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xqn22" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.407549 4732 scope.go:117] "RemoveContainer" containerID="3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.443923 4732 scope.go:117] "RemoveContainer" containerID="c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.461948 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqn22"] Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.471381 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xqn22"] Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.497671 4732 scope.go:117] "RemoveContainer" containerID="3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.552556 4732 scope.go:117] "RemoveContainer" containerID="3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741" Oct 10 08:52:22 crc kubenswrapper[4732]: E1010 08:52:22.553373 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741\": container with ID starting with 3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741 not found: ID does not exist" containerID="3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.553424 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741"} err="failed to get container status \"3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741\": rpc error: code = NotFound desc = could not find container \"3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741\": container with ID starting with 3f4ae6bd2a8a2b3dfaf591140b3989ac872e5b99f7dd5c6e7929ec59b82c9741 not found: ID does not exist" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.553458 4732 scope.go:117] "RemoveContainer" containerID="c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2" Oct 10 08:52:22 crc kubenswrapper[4732]: E1010 08:52:22.554272 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2\": container with ID starting with c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2 not found: ID does not exist" containerID="c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.554372 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2"} err="failed to get container status \"c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2\": rpc error: code = NotFound desc = could not find container \"c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2\": container with ID starting with c94f9405d213d0eafc4088cf52688a7b194527a0c5898ad8da1cadae6fab04d2 not found: ID does not exist" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.554621 4732 scope.go:117] "RemoveContainer" containerID="3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af" Oct 10 08:52:22 crc kubenswrapper[4732]: E1010 
08:52:22.555528 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af\": container with ID starting with 3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af not found: ID does not exist" containerID="3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af" Oct 10 08:52:22 crc kubenswrapper[4732]: I1010 08:52:22.555601 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af"} err="failed to get container status \"3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af\": rpc error: code = NotFound desc = could not find container \"3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af\": container with ID starting with 3ce08aa3e54330e1000a8711f7e33108dcdefc449dbb312fadad7eceb93410af not found: ID does not exist" Oct 10 08:52:23 crc kubenswrapper[4732]: I1010 08:52:23.681064 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" path="/var/lib/kubelet/pods/584b3c70-7fc2-4ff2-b234-5f4000604e2a/volumes" Oct 10 08:52:26 crc kubenswrapper[4732]: I1010 08:52:26.660269 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:52:26 crc kubenswrapper[4732]: E1010 08:52:26.661122 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:52:32 crc kubenswrapper[4732]: I1010 08:52:32.538255 
4732 generic.go:334] "Generic (PLEG): container finished" podID="4a2e5b09-abde-4b6e-925b-b2b2ff1a1346" containerID="e15f24d06f8d92473cb8fd29feedfe18cb685e9f29696531de69604d85ae16e5" exitCode=0 Oct 10 08:52:32 crc kubenswrapper[4732]: I1010 08:52:32.538339 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-thp4h" event={"ID":"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346","Type":"ContainerDied","Data":"e15f24d06f8d92473cb8fd29feedfe18cb685e9f29696531de69604d85ae16e5"} Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.038724 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.227036 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-ssh-key\") pod \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.227088 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk29z\" (UniqueName: \"kubernetes.io/projected/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-kube-api-access-bk29z\") pod \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.227106 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-inventory\") pod \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\" (UID: \"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346\") " Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.234087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-kube-api-access-bk29z" 
(OuterVolumeSpecName: "kube-api-access-bk29z") pod "4a2e5b09-abde-4b6e-925b-b2b2ff1a1346" (UID: "4a2e5b09-abde-4b6e-925b-b2b2ff1a1346"). InnerVolumeSpecName "kube-api-access-bk29z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.289730 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-inventory" (OuterVolumeSpecName: "inventory") pod "4a2e5b09-abde-4b6e-925b-b2b2ff1a1346" (UID: "4a2e5b09-abde-4b6e-925b-b2b2ff1a1346"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.293068 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4a2e5b09-abde-4b6e-925b-b2b2ff1a1346" (UID: "4a2e5b09-abde-4b6e-925b-b2b2ff1a1346"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.329378 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.329705 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.329725 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk29z\" (UniqueName: \"kubernetes.io/projected/4a2e5b09-abde-4b6e-925b-b2b2ff1a1346-kube-api-access-bk29z\") on node \"crc\" DevicePath \"\"" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.560533 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-thp4h" event={"ID":"4a2e5b09-abde-4b6e-925b-b2b2ff1a1346","Type":"ContainerDied","Data":"bb91d84fec3a91df61b5a2b3804a16962bc21dd10c4299105aaf7df5f1f3cf76"} Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.560575 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb91d84fec3a91df61b5a2b3804a16962bc21dd10c4299105aaf7df5f1f3cf76" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.560629 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-thp4h" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.688405 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-ksh2z"] Oct 10 08:52:34 crc kubenswrapper[4732]: E1010 08:52:34.689521 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="registry-server" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.689785 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="registry-server" Oct 10 08:52:34 crc kubenswrapper[4732]: E1010 08:52:34.689970 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="extract-content" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.690104 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="extract-content" Oct 10 08:52:34 crc kubenswrapper[4732]: E1010 08:52:34.690277 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2e5b09-abde-4b6e-925b-b2b2ff1a1346" containerName="install-os-openstack-openstack-cell1" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.690412 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2e5b09-abde-4b6e-925b-b2b2ff1a1346" containerName="install-os-openstack-openstack-cell1" Oct 10 08:52:34 crc kubenswrapper[4732]: E1010 08:52:34.690586 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="extract-utilities" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.690788 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="extract-utilities" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.691320 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4a2e5b09-abde-4b6e-925b-b2b2ff1a1346" containerName="install-os-openstack-openstack-cell1" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.691518 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="584b3c70-7fc2-4ff2-b234-5f4000604e2a" containerName="registry-server" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.692974 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.695392 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.695472 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.696417 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.697716 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.697991 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-ksh2z"] Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.841252 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vds2m\" (UniqueName: \"kubernetes.io/projected/1690368c-3a87-48f6-9b22-ca371199b4dd-kube-api-access-vds2m\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.841290 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-ssh-key\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.841314 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-inventory\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.943921 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vds2m\" (UniqueName: \"kubernetes.io/projected/1690368c-3a87-48f6-9b22-ca371199b4dd-kube-api-access-vds2m\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.943983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-ssh-key\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.944033 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-inventory\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " 
pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.948841 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-ssh-key\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.952324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-inventory\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:34 crc kubenswrapper[4732]: I1010 08:52:34.975447 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vds2m\" (UniqueName: \"kubernetes.io/projected/1690368c-3a87-48f6-9b22-ca371199b4dd-kube-api-access-vds2m\") pod \"configure-os-openstack-openstack-cell1-ksh2z\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:35 crc kubenswrapper[4732]: I1010 08:52:35.012888 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:52:35 crc kubenswrapper[4732]: I1010 08:52:35.401580 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-ksh2z"] Oct 10 08:52:35 crc kubenswrapper[4732]: I1010 08:52:35.576209 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" event={"ID":"1690368c-3a87-48f6-9b22-ca371199b4dd","Type":"ContainerStarted","Data":"8444865854e0ade9ba200a8f3429fab758bf52475f60cc99ab03f797d794fac8"} Oct 10 08:52:36 crc kubenswrapper[4732]: I1010 08:52:36.589552 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" event={"ID":"1690368c-3a87-48f6-9b22-ca371199b4dd","Type":"ContainerStarted","Data":"3999a31c1a9929d1d932fe1de0640b7292eae93971b519ccfe18f9cf16714c26"} Oct 10 08:52:36 crc kubenswrapper[4732]: I1010 08:52:36.626340 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" podStartSLOduration=2.009533109 podStartE2EDuration="2.626310938s" podCreationTimestamp="2025-10-10 08:52:34 +0000 UTC" firstStartedPulling="2025-10-10 08:52:35.397850858 +0000 UTC m=+7282.467442099" lastFinishedPulling="2025-10-10 08:52:36.014628687 +0000 UTC m=+7283.084219928" observedRunningTime="2025-10-10 08:52:36.61415323 +0000 UTC m=+7283.683744551" watchObservedRunningTime="2025-10-10 08:52:36.626310938 +0000 UTC m=+7283.695902219" Oct 10 08:52:39 crc kubenswrapper[4732]: I1010 08:52:39.662261 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:52:39 crc kubenswrapper[4732]: E1010 08:52:39.662903 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:52:54 crc kubenswrapper[4732]: I1010 08:52:54.660636 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:52:54 crc kubenswrapper[4732]: E1010 08:52:54.661756 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:53:09 crc kubenswrapper[4732]: I1010 08:53:09.660525 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:53:09 crc kubenswrapper[4732]: I1010 08:53:09.955256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"0337e2dd4003adeeedb922754f25d102fb276bba03a83e971b1f685a707e8c87"} Oct 10 08:53:29 crc kubenswrapper[4732]: I1010 08:53:29.209271 4732 generic.go:334] "Generic (PLEG): container finished" podID="1690368c-3a87-48f6-9b22-ca371199b4dd" containerID="3999a31c1a9929d1d932fe1de0640b7292eae93971b519ccfe18f9cf16714c26" exitCode=0 Oct 10 08:53:29 crc kubenswrapper[4732]: I1010 08:53:29.209339 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" 
event={"ID":"1690368c-3a87-48f6-9b22-ca371199b4dd","Type":"ContainerDied","Data":"3999a31c1a9929d1d932fe1de0640b7292eae93971b519ccfe18f9cf16714c26"} Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.651826 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.805506 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-inventory\") pod \"1690368c-3a87-48f6-9b22-ca371199b4dd\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.806095 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vds2m\" (UniqueName: \"kubernetes.io/projected/1690368c-3a87-48f6-9b22-ca371199b4dd-kube-api-access-vds2m\") pod \"1690368c-3a87-48f6-9b22-ca371199b4dd\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.806167 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-ssh-key\") pod \"1690368c-3a87-48f6-9b22-ca371199b4dd\" (UID: \"1690368c-3a87-48f6-9b22-ca371199b4dd\") " Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.811807 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1690368c-3a87-48f6-9b22-ca371199b4dd-kube-api-access-vds2m" (OuterVolumeSpecName: "kube-api-access-vds2m") pod "1690368c-3a87-48f6-9b22-ca371199b4dd" (UID: "1690368c-3a87-48f6-9b22-ca371199b4dd"). InnerVolumeSpecName "kube-api-access-vds2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.833755 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-inventory" (OuterVolumeSpecName: "inventory") pod "1690368c-3a87-48f6-9b22-ca371199b4dd" (UID: "1690368c-3a87-48f6-9b22-ca371199b4dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.841011 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1690368c-3a87-48f6-9b22-ca371199b4dd" (UID: "1690368c-3a87-48f6-9b22-ca371199b4dd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.910097 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vds2m\" (UniqueName: \"kubernetes.io/projected/1690368c-3a87-48f6-9b22-ca371199b4dd-kube-api-access-vds2m\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.910151 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:30 crc kubenswrapper[4732]: I1010 08:53:30.910174 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1690368c-3a87-48f6-9b22-ca371199b4dd-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.234576 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" 
event={"ID":"1690368c-3a87-48f6-9b22-ca371199b4dd","Type":"ContainerDied","Data":"8444865854e0ade9ba200a8f3429fab758bf52475f60cc99ab03f797d794fac8"} Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.234626 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8444865854e0ade9ba200a8f3429fab758bf52475f60cc99ab03f797d794fac8" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.234717 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-ksh2z" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.337408 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-scp5h"] Oct 10 08:53:31 crc kubenswrapper[4732]: E1010 08:53:31.337897 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1690368c-3a87-48f6-9b22-ca371199b4dd" containerName="configure-os-openstack-openstack-cell1" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.337920 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1690368c-3a87-48f6-9b22-ca371199b4dd" containerName="configure-os-openstack-openstack-cell1" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.338313 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1690368c-3a87-48f6-9b22-ca371199b4dd" containerName="configure-os-openstack-openstack-cell1" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.339156 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.342112 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.342673 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.343304 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.345526 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.347453 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-scp5h"] Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.522905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xhrt\" (UniqueName: \"kubernetes.io/projected/5d62c0cf-4132-4b71-af85-ca430cab6a8f-kube-api-access-4xhrt\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.523119 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-inventory-0\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.523247 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.625873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-inventory-0\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.626070 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.626158 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xhrt\" (UniqueName: \"kubernetes.io/projected/5d62c0cf-4132-4b71-af85-ca430cab6a8f-kube-api-access-4xhrt\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.632586 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-inventory-0\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.635789 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.646620 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xhrt\" (UniqueName: \"kubernetes.io/projected/5d62c0cf-4132-4b71-af85-ca430cab6a8f-kube-api-access-4xhrt\") pod \"ssh-known-hosts-openstack-scp5h\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:31 crc kubenswrapper[4732]: I1010 08:53:31.669414 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:32 crc kubenswrapper[4732]: I1010 08:53:32.327400 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-scp5h"] Oct 10 08:53:33 crc kubenswrapper[4732]: I1010 08:53:33.272547 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-scp5h" event={"ID":"5d62c0cf-4132-4b71-af85-ca430cab6a8f","Type":"ContainerStarted","Data":"b40bae73a588085b994dbb1924d1d6c139db3b8055b28baa8b9dea823e26bc3e"} Oct 10 08:53:34 crc kubenswrapper[4732]: I1010 08:53:34.295035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-scp5h" event={"ID":"5d62c0cf-4132-4b71-af85-ca430cab6a8f","Type":"ContainerStarted","Data":"de13d723c139fac70a0bb86d679b7d645bdd3853953ed3eb2a090f173bf82cad"} Oct 10 08:53:34 crc kubenswrapper[4732]: I1010 08:53:34.331709 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-scp5h" podStartSLOduration=2.519295699 podStartE2EDuration="3.331668017s" podCreationTimestamp="2025-10-10 08:53:31 +0000 UTC" firstStartedPulling="2025-10-10 08:53:32.327921021 +0000 UTC m=+7339.397512302" 
lastFinishedPulling="2025-10-10 08:53:33.140293359 +0000 UTC m=+7340.209884620" observedRunningTime="2025-10-10 08:53:34.328643796 +0000 UTC m=+7341.398235067" watchObservedRunningTime="2025-10-10 08:53:34.331668017 +0000 UTC m=+7341.401259268" Oct 10 08:53:43 crc kubenswrapper[4732]: I1010 08:53:43.403640 4732 generic.go:334] "Generic (PLEG): container finished" podID="5d62c0cf-4132-4b71-af85-ca430cab6a8f" containerID="de13d723c139fac70a0bb86d679b7d645bdd3853953ed3eb2a090f173bf82cad" exitCode=0 Oct 10 08:53:43 crc kubenswrapper[4732]: I1010 08:53:43.404110 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-scp5h" event={"ID":"5d62c0cf-4132-4b71-af85-ca430cab6a8f","Type":"ContainerDied","Data":"de13d723c139fac70a0bb86d679b7d645bdd3853953ed3eb2a090f173bf82cad"} Oct 10 08:53:44 crc kubenswrapper[4732]: I1010 08:53:44.881438 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:44 crc kubenswrapper[4732]: I1010 08:53:44.935634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xhrt\" (UniqueName: \"kubernetes.io/projected/5d62c0cf-4132-4b71-af85-ca430cab6a8f-kube-api-access-4xhrt\") pod \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " Oct 10 08:53:44 crc kubenswrapper[4732]: I1010 08:53:44.935809 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-inventory-0\") pod \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " Oct 10 08:53:44 crc kubenswrapper[4732]: I1010 08:53:44.935874 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-ssh-key-openstack-cell1\") pod 
\"5d62c0cf-4132-4b71-af85-ca430cab6a8f\" (UID: \"5d62c0cf-4132-4b71-af85-ca430cab6a8f\") " Oct 10 08:53:44 crc kubenswrapper[4732]: I1010 08:53:44.944104 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d62c0cf-4132-4b71-af85-ca430cab6a8f-kube-api-access-4xhrt" (OuterVolumeSpecName: "kube-api-access-4xhrt") pod "5d62c0cf-4132-4b71-af85-ca430cab6a8f" (UID: "5d62c0cf-4132-4b71-af85-ca430cab6a8f"). InnerVolumeSpecName "kube-api-access-4xhrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:53:44 crc kubenswrapper[4732]: I1010 08:53:44.966284 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5d62c0cf-4132-4b71-af85-ca430cab6a8f" (UID: "5d62c0cf-4132-4b71-af85-ca430cab6a8f"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:53:44 crc kubenswrapper[4732]: I1010 08:53:44.967682 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5d62c0cf-4132-4b71-af85-ca430cab6a8f" (UID: "5d62c0cf-4132-4b71-af85-ca430cab6a8f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.037169 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xhrt\" (UniqueName: \"kubernetes.io/projected/5d62c0cf-4132-4b71-af85-ca430cab6a8f-kube-api-access-4xhrt\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.037211 4732 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.037224 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5d62c0cf-4132-4b71-af85-ca430cab6a8f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.426650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-scp5h" event={"ID":"5d62c0cf-4132-4b71-af85-ca430cab6a8f","Type":"ContainerDied","Data":"b40bae73a588085b994dbb1924d1d6c139db3b8055b28baa8b9dea823e26bc3e"} Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.426720 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-scp5h" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.426751 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b40bae73a588085b994dbb1924d1d6c139db3b8055b28baa8b9dea823e26bc3e" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.499592 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-gcl2g"] Oct 10 08:53:45 crc kubenswrapper[4732]: E1010 08:53:45.500196 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d62c0cf-4132-4b71-af85-ca430cab6a8f" containerName="ssh-known-hosts-openstack" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.500261 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d62c0cf-4132-4b71-af85-ca430cab6a8f" containerName="ssh-known-hosts-openstack" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.500506 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d62c0cf-4132-4b71-af85-ca430cab6a8f" containerName="ssh-known-hosts-openstack" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.501281 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.503454 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.503685 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.504033 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.504371 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.522061 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-gcl2g"] Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.651930 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkwhr\" (UniqueName: \"kubernetes.io/projected/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-kube-api-access-lkwhr\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.652018 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-inventory\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.652103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-ssh-key\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.754555 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-ssh-key\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.755687 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkwhr\" (UniqueName: \"kubernetes.io/projected/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-kube-api-access-lkwhr\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.756696 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-inventory\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.758934 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-ssh-key\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.760821 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-inventory\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.783521 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkwhr\" (UniqueName: \"kubernetes.io/projected/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-kube-api-access-lkwhr\") pod \"run-os-openstack-openstack-cell1-gcl2g\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:45 crc kubenswrapper[4732]: I1010 08:53:45.824110 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:46 crc kubenswrapper[4732]: I1010 08:53:46.348492 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-gcl2g"] Oct 10 08:53:46 crc kubenswrapper[4732]: W1010 08:53:46.351161 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6b83b95_f247_43e6_a1c3_3d3a2cb6cb5b.slice/crio-df77d8993e413137d07fd70a5447002b72d13d09b0bdaaf4444b76e6d35b7bf5 WatchSource:0}: Error finding container df77d8993e413137d07fd70a5447002b72d13d09b0bdaaf4444b76e6d35b7bf5: Status 404 returned error can't find the container with id df77d8993e413137d07fd70a5447002b72d13d09b0bdaaf4444b76e6d35b7bf5 Oct 10 08:53:46 crc kubenswrapper[4732]: I1010 08:53:46.443161 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" event={"ID":"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b","Type":"ContainerStarted","Data":"df77d8993e413137d07fd70a5447002b72d13d09b0bdaaf4444b76e6d35b7bf5"} Oct 10 08:53:47 crc kubenswrapper[4732]: I1010 08:53:47.454811 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-openstack-openstack-cell1-gcl2g" event={"ID":"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b","Type":"ContainerStarted","Data":"7a40e347858b910aa7360c8825a8cffa2402f4879e412025438bb4eef6391a3b"} Oct 10 08:53:47 crc kubenswrapper[4732]: I1010 08:53:47.476597 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" podStartSLOduration=1.789691598 podStartE2EDuration="2.476571799s" podCreationTimestamp="2025-10-10 08:53:45 +0000 UTC" firstStartedPulling="2025-10-10 08:53:46.353657359 +0000 UTC m=+7353.423248600" lastFinishedPulling="2025-10-10 08:53:47.04053756 +0000 UTC m=+7354.110128801" observedRunningTime="2025-10-10 08:53:47.473005973 +0000 UTC m=+7354.542597224" watchObservedRunningTime="2025-10-10 08:53:47.476571799 +0000 UTC m=+7354.546163070" Oct 10 08:53:55 crc kubenswrapper[4732]: I1010 08:53:55.537375 4732 generic.go:334] "Generic (PLEG): container finished" podID="b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b" containerID="7a40e347858b910aa7360c8825a8cffa2402f4879e412025438bb4eef6391a3b" exitCode=0 Oct 10 08:53:55 crc kubenswrapper[4732]: I1010 08:53:55.537471 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" event={"ID":"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b","Type":"ContainerDied","Data":"7a40e347858b910aa7360c8825a8cffa2402f4879e412025438bb4eef6391a3b"} Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.166819 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.322104 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-ssh-key\") pod \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.322759 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkwhr\" (UniqueName: \"kubernetes.io/projected/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-kube-api-access-lkwhr\") pod \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.322796 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-inventory\") pod \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\" (UID: \"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b\") " Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.330432 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-kube-api-access-lkwhr" (OuterVolumeSpecName: "kube-api-access-lkwhr") pod "b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b" (UID: "b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b"). InnerVolumeSpecName "kube-api-access-lkwhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.356164 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-inventory" (OuterVolumeSpecName: "inventory") pod "b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b" (UID: "b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.361942 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b" (UID: "b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.425310 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkwhr\" (UniqueName: \"kubernetes.io/projected/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-kube-api-access-lkwhr\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.425356 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.425370 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.562028 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" event={"ID":"b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b","Type":"ContainerDied","Data":"df77d8993e413137d07fd70a5447002b72d13d09b0bdaaf4444b76e6d35b7bf5"} Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.562067 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df77d8993e413137d07fd70a5447002b72d13d09b0bdaaf4444b76e6d35b7bf5" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.562130 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-gcl2g" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.658175 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-rg9rf"] Oct 10 08:53:57 crc kubenswrapper[4732]: E1010 08:53:57.658720 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b" containerName="run-os-openstack-openstack-cell1" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.658743 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b" containerName="run-os-openstack-openstack-cell1" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.659019 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b" containerName="run-os-openstack-openstack-cell1" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.661909 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.664791 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.675928 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.675972 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.675975 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.688994 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-rg9rf"] Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.835391 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r498b\" (UniqueName: \"kubernetes.io/projected/93fa28a0-a777-44b4-8649-fda723a616d7-kube-api-access-r498b\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.835799 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-inventory\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.836005 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.938242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-inventory\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.938386 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.938469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r498b\" (UniqueName: \"kubernetes.io/projected/93fa28a0-a777-44b4-8649-fda723a616d7-kube-api-access-r498b\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.943763 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.953979 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-inventory\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:57 crc kubenswrapper[4732]: I1010 08:53:57.959553 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r498b\" (UniqueName: \"kubernetes.io/projected/93fa28a0-a777-44b4-8649-fda723a616d7-kube-api-access-r498b\") pod \"reboot-os-openstack-openstack-cell1-rg9rf\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") " pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:58 crc kubenswrapper[4732]: I1010 08:53:58.004524 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" Oct 10 08:53:58 crc kubenswrapper[4732]: I1010 08:53:58.573490 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-rg9rf"] Oct 10 08:53:59 crc kubenswrapper[4732]: I1010 08:53:59.583789 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" event={"ID":"93fa28a0-a777-44b4-8649-fda723a616d7","Type":"ContainerStarted","Data":"c4a7ea0fef17cf6f76586760d53fdb7061be8fef649b122c9fc90080f90ccf12"} Oct 10 08:53:59 crc kubenswrapper[4732]: I1010 08:53:59.584444 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" event={"ID":"93fa28a0-a777-44b4-8649-fda723a616d7","Type":"ContainerStarted","Data":"7dbfd759bfcbed60188a4bda854c15fdea9031886f79dd058a7ad47ffea40457"} Oct 10 08:53:59 crc kubenswrapper[4732]: I1010 08:53:59.613169 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" podStartSLOduration=2.000452326 podStartE2EDuration="2.613143244s" 
podCreationTimestamp="2025-10-10 08:53:57 +0000 UTC" firstStartedPulling="2025-10-10 08:53:58.582267298 +0000 UTC m=+7365.651858549" lastFinishedPulling="2025-10-10 08:53:59.194958206 +0000 UTC m=+7366.264549467" observedRunningTime="2025-10-10 08:53:59.606155235 +0000 UTC m=+7366.675746536" watchObservedRunningTime="2025-10-10 08:53:59.613143244 +0000 UTC m=+7366.682734495"
Oct 10 08:54:15 crc kubenswrapper[4732]: I1010 08:54:15.778173 4732 generic.go:334] "Generic (PLEG): container finished" podID="93fa28a0-a777-44b4-8649-fda723a616d7" containerID="c4a7ea0fef17cf6f76586760d53fdb7061be8fef649b122c9fc90080f90ccf12" exitCode=0
Oct 10 08:54:15 crc kubenswrapper[4732]: I1010 08:54:15.778429 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" event={"ID":"93fa28a0-a777-44b4-8649-fda723a616d7","Type":"ContainerDied","Data":"c4a7ea0fef17cf6f76586760d53fdb7061be8fef649b122c9fc90080f90ccf12"}
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.347572 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.466879 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r498b\" (UniqueName: \"kubernetes.io/projected/93fa28a0-a777-44b4-8649-fda723a616d7-kube-api-access-r498b\") pod \"93fa28a0-a777-44b4-8649-fda723a616d7\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") "
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.467057 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-ssh-key\") pod \"93fa28a0-a777-44b4-8649-fda723a616d7\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") "
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.467245 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-inventory\") pod \"93fa28a0-a777-44b4-8649-fda723a616d7\" (UID: \"93fa28a0-a777-44b4-8649-fda723a616d7\") "
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.473086 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93fa28a0-a777-44b4-8649-fda723a616d7-kube-api-access-r498b" (OuterVolumeSpecName: "kube-api-access-r498b") pod "93fa28a0-a777-44b4-8649-fda723a616d7" (UID: "93fa28a0-a777-44b4-8649-fda723a616d7"). InnerVolumeSpecName "kube-api-access-r498b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.501827 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-inventory" (OuterVolumeSpecName: "inventory") pod "93fa28a0-a777-44b4-8649-fda723a616d7" (UID: "93fa28a0-a777-44b4-8649-fda723a616d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.502266 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "93fa28a0-a777-44b4-8649-fda723a616d7" (UID: "93fa28a0-a777-44b4-8649-fda723a616d7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.570398 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-inventory\") on node \"crc\" DevicePath \"\""
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.570477 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r498b\" (UniqueName: \"kubernetes.io/projected/93fa28a0-a777-44b4-8649-fda723a616d7-kube-api-access-r498b\") on node \"crc\" DevicePath \"\""
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.570505 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93fa28a0-a777-44b4-8649-fda723a616d7-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.803753 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf" event={"ID":"93fa28a0-a777-44b4-8649-fda723a616d7","Type":"ContainerDied","Data":"7dbfd759bfcbed60188a4bda854c15fdea9031886f79dd058a7ad47ffea40457"}
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.803817 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-rg9rf"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.803822 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dbfd759bfcbed60188a4bda854c15fdea9031886f79dd058a7ad47ffea40457"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.964454 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-swd76"]
Oct 10 08:54:17 crc kubenswrapper[4732]: E1010 08:54:17.964873 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93fa28a0-a777-44b4-8649-fda723a616d7" containerName="reboot-os-openstack-openstack-cell1"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.964886 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="93fa28a0-a777-44b4-8649-fda723a616d7" containerName="reboot-os-openstack-openstack-cell1"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.965100 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="93fa28a0-a777-44b4-8649-fda723a616d7" containerName="reboot-os-openstack-openstack-cell1"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.965882 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.968429 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.968568 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.968669 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.969021 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.969196 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.969311 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.969469 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.974714 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 10 08:54:17 crc kubenswrapper[4732]: I1010 08:54:17.982363 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-swd76"]
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.081514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.081897 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.081941 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.081962 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.081977 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.081994 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ssh-key\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082023 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-inventory\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082056 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082072 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd59j\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-kube-api-access-sd59j\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082093 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082112 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082152 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082202 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082241 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.082257 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195241 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195410 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195488 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195560 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195611 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ssh-key\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195727 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-inventory\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195836 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd59j\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-kube-api-access-sd59j\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.195978 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.196097 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.196235 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.196577 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.196812 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.196880 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.197049 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.217931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.218595 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.218640 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.221257 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.223286 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.225780 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.225923 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-inventory\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.226364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.227102 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd59j\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-kube-api-access-sd59j\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.229665 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.229677 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ssh-key\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.231098 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.239222 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.241225 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.241521 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-swd76\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.282716 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:54:18 crc kubenswrapper[4732]: I1010 08:54:18.932985 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-swd76"]
Oct 10 08:54:19 crc kubenswrapper[4732]: I1010 08:54:19.836371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-swd76" event={"ID":"9f873c30-bbd6-4e94-9cff-9c998ab92b9c","Type":"ContainerStarted","Data":"c91ecdaf703094006664d832a90da98939229351cb1989c7f7b59b85a656ee65"}
Oct 10 08:54:19 crc kubenswrapper[4732]: I1010 08:54:19.836680 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-swd76" event={"ID":"9f873c30-bbd6-4e94-9cff-9c998ab92b9c","Type":"ContainerStarted","Data":"7c5c25a4d30796b3039ddbc20cb99098ee78f60a1a4c433a34554d0feae25b26"}
Oct 10 08:54:19 crc kubenswrapper[4732]: I1010 08:54:19.866525 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-swd76" podStartSLOduration=2.414487557 podStartE2EDuration="2.866501428s" podCreationTimestamp="2025-10-10 08:54:17 +0000 UTC" firstStartedPulling="2025-10-10 08:54:18.919924387 +0000 UTC m=+7385.989537809" lastFinishedPulling="2025-10-10 08:54:19.371960429 +0000 UTC m=+7386.441551680" observedRunningTime="2025-10-10 08:54:19.857026462 +0000 UTC m=+7386.926617723" watchObservedRunningTime="2025-10-10 08:54:19.866501428 +0000 UTC m=+7386.936092679"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.207939 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d542l"]
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.214166 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.226894 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d542l"]
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.316738 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-catalog-content\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.317153 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78m2\" (UniqueName: \"kubernetes.io/projected/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-kube-api-access-g78m2\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.317821 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-utilities\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.419733 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g78m2\" (UniqueName: \"kubernetes.io/projected/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-kube-api-access-g78m2\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.419989 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-utilities\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.420032 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-catalog-content\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.420523 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-utilities\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.420650 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-catalog-content\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.439133 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78m2\" (UniqueName: \"kubernetes.io/projected/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-kube-api-access-g78m2\") pod \"community-operators-d542l\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") " pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:27 crc kubenswrapper[4732]: I1010 08:54:27.547970 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:28 crc kubenswrapper[4732]: I1010 08:54:28.099291 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d542l"]
Oct 10 08:54:28 crc kubenswrapper[4732]: I1010 08:54:28.952858 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerID="a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998" exitCode=0
Oct 10 08:54:28 crc kubenswrapper[4732]: I1010 08:54:28.952957 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d542l" event={"ID":"7d3eb701-e7a2-4092-8bc6-e33f4a84c938","Type":"ContainerDied","Data":"a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998"}
Oct 10 08:54:28 crc kubenswrapper[4732]: I1010 08:54:28.953936 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d542l" event={"ID":"7d3eb701-e7a2-4092-8bc6-e33f4a84c938","Type":"ContainerStarted","Data":"58d2a3f1c00722fd49214b8b6363c5d9f35211bc141da4edf3ebd1910ee702bc"}
Oct 10 08:54:29 crc kubenswrapper[4732]: I1010 08:54:29.964546 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d542l" event={"ID":"7d3eb701-e7a2-4092-8bc6-e33f4a84c938","Type":"ContainerStarted","Data":"3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72"}
Oct 10 08:54:30 crc kubenswrapper[4732]: I1010 08:54:30.987527 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerID="3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72" exitCode=0
Oct 10 08:54:30 crc kubenswrapper[4732]: I1010 08:54:30.987644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d542l" event={"ID":"7d3eb701-e7a2-4092-8bc6-e33f4a84c938","Type":"ContainerDied","Data":"3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72"}
Oct 10 08:54:31 crc kubenswrapper[4732]: I1010 08:54:31.998763 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d542l" event={"ID":"7d3eb701-e7a2-4092-8bc6-e33f4a84c938","Type":"ContainerStarted","Data":"4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a"}
Oct 10 08:54:32 crc kubenswrapper[4732]: I1010 08:54:32.027972 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d542l" podStartSLOduration=2.4408262609999998 podStartE2EDuration="5.027955074s" podCreationTimestamp="2025-10-10 08:54:27 +0000 UTC" firstStartedPulling="2025-10-10 08:54:28.956442606 +0000 UTC m=+7396.026033887" lastFinishedPulling="2025-10-10 08:54:31.543571419 +0000 UTC m=+7398.613162700" observedRunningTime="2025-10-10 08:54:32.026003101 +0000 UTC m=+7399.095594392" watchObservedRunningTime="2025-10-10 08:54:32.027955074 +0000 UTC m=+7399.097546315"
Oct 10 08:54:37 crc kubenswrapper[4732]: I1010 08:54:37.549052 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:37 crc kubenswrapper[4732]: I1010 08:54:37.549571 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:37 crc kubenswrapper[4732]: I1010 08:54:37.615854 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:38 crc kubenswrapper[4732]: I1010 08:54:38.144571 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:38 crc kubenswrapper[4732]: I1010 08:54:38.204740 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d542l"]
Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.095569 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d542l" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="registry-server" containerID="cri-o://4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a" gracePeriod=2
Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.562005 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d542l"
Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.642536 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-catalog-content\") pod \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") "
Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.642627 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-utilities\") pod \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") "
Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.642763 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g78m2\" (UniqueName: \"kubernetes.io/projected/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-kube-api-access-g78m2\") pod \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\" (UID: \"7d3eb701-e7a2-4092-8bc6-e33f4a84c938\") "
Oct 10 08:54:40 crc kubenswrapper[4732]: 
I1010 08:54:40.644348 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-utilities" (OuterVolumeSpecName: "utilities") pod "7d3eb701-e7a2-4092-8bc6-e33f4a84c938" (UID: "7d3eb701-e7a2-4092-8bc6-e33f4a84c938"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.665286 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-kube-api-access-g78m2" (OuterVolumeSpecName: "kube-api-access-g78m2") pod "7d3eb701-e7a2-4092-8bc6-e33f4a84c938" (UID: "7d3eb701-e7a2-4092-8bc6-e33f4a84c938"). InnerVolumeSpecName "kube-api-access-g78m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.704616 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d3eb701-e7a2-4092-8bc6-e33f4a84c938" (UID: "7d3eb701-e7a2-4092-8bc6-e33f4a84c938"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.745439 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.745480 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:54:40 crc kubenswrapper[4732]: I1010 08:54:40.745492 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g78m2\" (UniqueName: \"kubernetes.io/projected/7d3eb701-e7a2-4092-8bc6-e33f4a84c938-kube-api-access-g78m2\") on node \"crc\" DevicePath \"\"" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.110341 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerID="4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a" exitCode=0 Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.110376 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d542l" event={"ID":"7d3eb701-e7a2-4092-8bc6-e33f4a84c938","Type":"ContainerDied","Data":"4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a"} Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.110399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d542l" event={"ID":"7d3eb701-e7a2-4092-8bc6-e33f4a84c938","Type":"ContainerDied","Data":"58d2a3f1c00722fd49214b8b6363c5d9f35211bc141da4edf3ebd1910ee702bc"} Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.110416 4732 scope.go:117] "RemoveContainer" containerID="4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 
08:54:41.110552 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d542l" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.152098 4732 scope.go:117] "RemoveContainer" containerID="3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.164990 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d542l"] Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.175933 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d542l"] Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.194845 4732 scope.go:117] "RemoveContainer" containerID="a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.245921 4732 scope.go:117] "RemoveContainer" containerID="4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a" Oct 10 08:54:41 crc kubenswrapper[4732]: E1010 08:54:41.246554 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a\": container with ID starting with 4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a not found: ID does not exist" containerID="4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.246601 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a"} err="failed to get container status \"4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a\": rpc error: code = NotFound desc = could not find container \"4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a\": container with ID starting with 
4798dd510a7984b8f53698818c9e4e94236658a50821ba926293601fe0c9434a not found: ID does not exist" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.246629 4732 scope.go:117] "RemoveContainer" containerID="3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72" Oct 10 08:54:41 crc kubenswrapper[4732]: E1010 08:54:41.247032 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72\": container with ID starting with 3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72 not found: ID does not exist" containerID="3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.247058 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72"} err="failed to get container status \"3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72\": rpc error: code = NotFound desc = could not find container \"3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72\": container with ID starting with 3ee23e8f6595f2a42d984ea4d3a435644c6eb418f797e92ffeed1e8357ec8a72 not found: ID does not exist" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.247075 4732 scope.go:117] "RemoveContainer" containerID="a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998" Oct 10 08:54:41 crc kubenswrapper[4732]: E1010 08:54:41.247510 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998\": container with ID starting with a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998 not found: ID does not exist" containerID="a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998" Oct 10 08:54:41 crc 
kubenswrapper[4732]: I1010 08:54:41.247650 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998"} err="failed to get container status \"a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998\": rpc error: code = NotFound desc = could not find container \"a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998\": container with ID starting with a9965d37a668e66d2f86a41b8a0bcbf28a9700202b0e2d752438e998654d3998 not found: ID does not exist" Oct 10 08:54:41 crc kubenswrapper[4732]: I1010 08:54:41.682927 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" path="/var/lib/kubelet/pods/7d3eb701-e7a2-4092-8bc6-e33f4a84c938/volumes" Oct 10 08:54:59 crc kubenswrapper[4732]: I1010 08:54:59.337940 4732 generic.go:334] "Generic (PLEG): container finished" podID="9f873c30-bbd6-4e94-9cff-9c998ab92b9c" containerID="c91ecdaf703094006664d832a90da98939229351cb1989c7f7b59b85a656ee65" exitCode=0 Oct 10 08:54:59 crc kubenswrapper[4732]: I1010 08:54:59.338075 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-swd76" event={"ID":"9f873c30-bbd6-4e94-9cff-9c998ab92b9c","Type":"ContainerDied","Data":"c91ecdaf703094006664d832a90da98939229351cb1989c7f7b59b85a656ee65"} Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.814630 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-swd76" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.920727 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-libvirt-default-certs-0\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921074 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ssh-key\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921185 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-dhcp-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921267 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-ovn-default-certs-0\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-inventory\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 
08:55:00.921514 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd59j\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-kube-api-access-sd59j\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-sriov-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921685 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-nova-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921815 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-libvirt-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921901 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-neutron-metadata-default-certs-0\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.921978 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-bootstrap-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.922150 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-telemetry-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.922256 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-metadata-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.922391 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ovn-combined-ca-bundle\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.922498 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-telemetry-default-certs-0\") pod \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\" (UID: \"9f873c30-bbd6-4e94-9cff-9c998ab92b9c\") " Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.929390 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.929412 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.929508 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-kube-api-access-sd59j" (OuterVolumeSpecName: "kube-api-access-sd59j") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "kube-api-access-sd59j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.931443 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.932473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.932973 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.933079 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.933253 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.933710 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.935410 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.936213 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.936871 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). 
InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.938846 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.961318 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:00 crc kubenswrapper[4732]: I1010 08:55:00.963846 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-inventory" (OuterVolumeSpecName: "inventory") pod "9f873c30-bbd6-4e94-9cff-9c998ab92b9c" (UID: "9f873c30-bbd6-4e94-9cff-9c998ab92b9c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.026508 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.026940 4732 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.027074 4732 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.027204 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.027338 4732 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.027464 4732 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.027636 4732 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.027911 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.028036 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.028153 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.028277 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.028406 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.028531 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:55:01 crc 
kubenswrapper[4732]: I1010 08:55:01.028646 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-inventory\") on node \"crc\" DevicePath \"\""
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.028809 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd59j\" (UniqueName: \"kubernetes.io/projected/9f873c30-bbd6-4e94-9cff-9c998ab92b9c-kube-api-access-sd59j\") on node \"crc\" DevicePath \"\""
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.368303 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-swd76" event={"ID":"9f873c30-bbd6-4e94-9cff-9c998ab92b9c","Type":"ContainerDied","Data":"7c5c25a4d30796b3039ddbc20cb99098ee78f60a1a4c433a34554d0feae25b26"}
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.368360 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c5c25a4d30796b3039ddbc20cb99098ee78f60a1a4c433a34554d0feae25b26"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.368423 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-swd76"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.506062 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-sczt2"]
Oct 10 08:55:01 crc kubenswrapper[4732]: E1010 08:55:01.506493 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="extract-utilities"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.506506 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="extract-utilities"
Oct 10 08:55:01 crc kubenswrapper[4732]: E1010 08:55:01.506516 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="extract-content"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.506522 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="extract-content"
Oct 10 08:55:01 crc kubenswrapper[4732]: E1010 08:55:01.506547 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="registry-server"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.506556 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="registry-server"
Oct 10 08:55:01 crc kubenswrapper[4732]: E1010 08:55:01.506573 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f873c30-bbd6-4e94-9cff-9c998ab92b9c" containerName="install-certs-openstack-openstack-cell1"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.506579 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f873c30-bbd6-4e94-9cff-9c998ab92b9c" containerName="install-certs-openstack-openstack-cell1"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.506819 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3eb701-e7a2-4092-8bc6-e33f4a84c938" containerName="registry-server"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.506860 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f873c30-bbd6-4e94-9cff-9c998ab92b9c" containerName="install-certs-openstack-openstack-cell1"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.507680 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.511731 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.511852 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.512059 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.513244 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.514182 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.520611 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-sczt2"]
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.643640 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ssh-key\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.643867 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-inventory\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.643910 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.643981 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2259\" (UniqueName: \"kubernetes.io/projected/8d6feded-06a9-476b-8ce0-9856c8ac5de2-kube-api-access-c2259\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.644066 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.746066 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ssh-key\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.746186 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-inventory\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.746220 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.746255 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2259\" (UniqueName: \"kubernetes.io/projected/8d6feded-06a9-476b-8ce0-9856c8ac5de2-kube-api-access-c2259\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.746319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.747744 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.750577 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.752572 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ssh-key\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.753785 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-inventory\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.765388 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2259\" (UniqueName: \"kubernetes.io/projected/8d6feded-06a9-476b-8ce0-9856c8ac5de2-kube-api-access-c2259\") pod \"ovn-openstack-openstack-cell1-sczt2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") " pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:01 crc kubenswrapper[4732]: I1010 08:55:01.831454 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:55:02 crc kubenswrapper[4732]: I1010 08:55:02.232546 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-sczt2"]
Oct 10 08:55:02 crc kubenswrapper[4732]: I1010 08:55:02.379100 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-sczt2" event={"ID":"8d6feded-06a9-476b-8ce0-9856c8ac5de2","Type":"ContainerStarted","Data":"9b5fb48c3f503f550e96f967b124d948c4bff5d8cddd903016393fd38c26d78e"}
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.413357 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-sczt2" event={"ID":"8d6feded-06a9-476b-8ce0-9856c8ac5de2","Type":"ContainerStarted","Data":"855a4b6969a89e6f015276ed127cf5994e60e0203e78975702b78b5b8e07d603"}
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.421863 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-md26j"]
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.425484 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.456073 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-md26j"]
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.470856 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-sczt2" podStartSLOduration=1.9304066039999999 podStartE2EDuration="2.470827052s" podCreationTimestamp="2025-10-10 08:55:01 +0000 UTC" firstStartedPulling="2025-10-10 08:55:02.24597662 +0000 UTC m=+7429.315567871" lastFinishedPulling="2025-10-10 08:55:02.786397078 +0000 UTC m=+7429.855988319" observedRunningTime="2025-10-10 08:55:03.438830089 +0000 UTC m=+7430.508421350" watchObservedRunningTime="2025-10-10 08:55:03.470827052 +0000 UTC m=+7430.540418293"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.491509 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb27v\" (UniqueName: \"kubernetes.io/projected/f82decbb-6535-44b0-9668-7cb9d4ad5edc-kube-api-access-lb27v\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.491589 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-utilities\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.491982 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-catalog-content\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.593640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb27v\" (UniqueName: \"kubernetes.io/projected/f82decbb-6535-44b0-9668-7cb9d4ad5edc-kube-api-access-lb27v\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.593747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-utilities\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.593828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-catalog-content\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.594399 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-catalog-content\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.594497 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-utilities\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.618012 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb27v\" (UniqueName: \"kubernetes.io/projected/f82decbb-6535-44b0-9668-7cb9d4ad5edc-kube-api-access-lb27v\") pod \"certified-operators-md26j\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") " pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:03 crc kubenswrapper[4732]: I1010 08:55:03.770799 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:04 crc kubenswrapper[4732]: I1010 08:55:04.324980 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-md26j"]
Oct 10 08:55:04 crc kubenswrapper[4732]: I1010 08:55:04.421381 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-md26j" event={"ID":"f82decbb-6535-44b0-9668-7cb9d4ad5edc","Type":"ContainerStarted","Data":"4f4038e9f5af4678f50eaaac2eec424dc26f523ceecf9d9e47423d0f76b8d0fb"}
Oct 10 08:55:05 crc kubenswrapper[4732]: I1010 08:55:05.431613 4732 generic.go:334] "Generic (PLEG): container finished" podID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerID="bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c" exitCode=0
Oct 10 08:55:05 crc kubenswrapper[4732]: I1010 08:55:05.431752 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-md26j" event={"ID":"f82decbb-6535-44b0-9668-7cb9d4ad5edc","Type":"ContainerDied","Data":"bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c"}
Oct 10 08:55:06 crc kubenswrapper[4732]: I1010 08:55:06.446403 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-md26j" event={"ID":"f82decbb-6535-44b0-9668-7cb9d4ad5edc","Type":"ContainerStarted","Data":"00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359"}
Oct 10 08:55:09 crc kubenswrapper[4732]: I1010 08:55:09.477570 4732 generic.go:334] "Generic (PLEG): container finished" podID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerID="00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359" exitCode=0
Oct 10 08:55:09 crc kubenswrapper[4732]: I1010 08:55:09.477634 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-md26j" event={"ID":"f82decbb-6535-44b0-9668-7cb9d4ad5edc","Type":"ContainerDied","Data":"00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359"}
Oct 10 08:55:10 crc kubenswrapper[4732]: I1010 08:55:10.488743 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-md26j" event={"ID":"f82decbb-6535-44b0-9668-7cb9d4ad5edc","Type":"ContainerStarted","Data":"c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f"}
Oct 10 08:55:10 crc kubenswrapper[4732]: I1010 08:55:10.509098 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-md26j" podStartSLOduration=3.055295093 podStartE2EDuration="7.509073251s" podCreationTimestamp="2025-10-10 08:55:03 +0000 UTC" firstStartedPulling="2025-10-10 08:55:05.433429457 +0000 UTC m=+7432.503020698" lastFinishedPulling="2025-10-10 08:55:09.887207605 +0000 UTC m=+7436.956798856" observedRunningTime="2025-10-10 08:55:10.506118491 +0000 UTC m=+7437.575709782" watchObservedRunningTime="2025-10-10 08:55:10.509073251 +0000 UTC m=+7437.578664532"
Oct 10 08:55:13 crc kubenswrapper[4732]: I1010 08:55:13.772151 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:13 crc kubenswrapper[4732]: I1010 08:55:13.772636 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:13 crc kubenswrapper[4732]: I1010 08:55:13.853434 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:23 crc kubenswrapper[4732]: I1010 08:55:23.839545 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:23 crc kubenswrapper[4732]: I1010 08:55:23.897924 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-md26j"]
Oct 10 08:55:24 crc kubenswrapper[4732]: I1010 08:55:24.640686 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-md26j" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="registry-server" containerID="cri-o://c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f" gracePeriod=2
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.163154 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.255674 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb27v\" (UniqueName: \"kubernetes.io/projected/f82decbb-6535-44b0-9668-7cb9d4ad5edc-kube-api-access-lb27v\") pod \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") "
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.255732 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-catalog-content\") pod \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") "
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.264445 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82decbb-6535-44b0-9668-7cb9d4ad5edc-kube-api-access-lb27v" (OuterVolumeSpecName: "kube-api-access-lb27v") pod "f82decbb-6535-44b0-9668-7cb9d4ad5edc" (UID: "f82decbb-6535-44b0-9668-7cb9d4ad5edc"). InnerVolumeSpecName "kube-api-access-lb27v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.305091 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f82decbb-6535-44b0-9668-7cb9d4ad5edc" (UID: "f82decbb-6535-44b0-9668-7cb9d4ad5edc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.356534 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.356603 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.357102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-utilities\") pod \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\" (UID: \"f82decbb-6535-44b0-9668-7cb9d4ad5edc\") "
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.357765 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb27v\" (UniqueName: \"kubernetes.io/projected/f82decbb-6535-44b0-9668-7cb9d4ad5edc-kube-api-access-lb27v\") on node \"crc\" DevicePath \"\""
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.357784 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.357897 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-utilities" (OuterVolumeSpecName: "utilities") pod "f82decbb-6535-44b0-9668-7cb9d4ad5edc" (UID: "f82decbb-6535-44b0-9668-7cb9d4ad5edc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.459864 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f82decbb-6535-44b0-9668-7cb9d4ad5edc-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.654780 4732 generic.go:334] "Generic (PLEG): container finished" podID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerID="c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f" exitCode=0
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.654811 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-md26j"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.654859 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-md26j" event={"ID":"f82decbb-6535-44b0-9668-7cb9d4ad5edc","Type":"ContainerDied","Data":"c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f"}
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.654893 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-md26j" event={"ID":"f82decbb-6535-44b0-9668-7cb9d4ad5edc","Type":"ContainerDied","Data":"4f4038e9f5af4678f50eaaac2eec424dc26f523ceecf9d9e47423d0f76b8d0fb"}
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.654926 4732 scope.go:117] "RemoveContainer" containerID="c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.717173 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-md26j"]
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.728710 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-md26j"]
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.730053 4732 scope.go:117] "RemoveContainer" containerID="00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.752358 4732 scope.go:117] "RemoveContainer" containerID="bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.804946 4732 scope.go:117] "RemoveContainer" containerID="c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f"
Oct 10 08:55:25 crc kubenswrapper[4732]: E1010 08:55:25.805532 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f\": container with ID starting with c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f not found: ID does not exist" containerID="c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.805567 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f"} err="failed to get container status \"c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f\": rpc error: code = NotFound desc = could not find container \"c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f\": container with ID starting with c0491b89e26abc66e31f2ae20de38af0c78d422167b521dd8ec582d27ae9469f not found: ID does not exist"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.805586 4732 scope.go:117] "RemoveContainer" containerID="00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359"
Oct 10 08:55:25 crc kubenswrapper[4732]: E1010 08:55:25.805861 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359\": container with ID starting with 00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359 not found: ID does not exist" containerID="00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.805880 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359"} err="failed to get container status \"00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359\": rpc error: code = NotFound desc = could not find container \"00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359\": container with ID starting with 00e186b40e9e7100ac513ea85225e104446157c3eea253ddb3d62954f7dc2359 not found: ID does not exist"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.805894 4732 scope.go:117] "RemoveContainer" containerID="bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c"
Oct 10 08:55:25 crc kubenswrapper[4732]: E1010 08:55:25.806079 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c\": container with ID starting with bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c not found: ID does not exist" containerID="bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c"
Oct 10 08:55:25 crc kubenswrapper[4732]: I1010 08:55:25.806102 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c"} err="failed to get container status \"bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c\": rpc error: code = NotFound desc = could not find container \"bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c\": container with ID starting with bfcea179e2476d30d2247ae5d9e6ec223bfb9d0db3113bcf3f21953d4a1cae7c not found: ID does not exist"
Oct 10 08:55:27 crc kubenswrapper[4732]: I1010 08:55:27.684425 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" path="/var/lib/kubelet/pods/f82decbb-6535-44b0-9668-7cb9d4ad5edc/volumes"
Oct 10 08:55:55 crc kubenswrapper[4732]: I1010 08:55:55.356966 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 08:55:55 crc kubenswrapper[4732]: I1010 08:55:55.357597 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 08:56:12 crc kubenswrapper[4732]: I1010 08:56:12.215087 4732 generic.go:334] "Generic (PLEG): container finished" podID="8d6feded-06a9-476b-8ce0-9856c8ac5de2" containerID="855a4b6969a89e6f015276ed127cf5994e60e0203e78975702b78b5b8e07d603" exitCode=0
Oct 10 08:56:12 crc kubenswrapper[4732]: I1010 08:56:12.215199 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-sczt2" event={"ID":"8d6feded-06a9-476b-8ce0-9856c8ac5de2","Type":"ContainerDied","Data":"855a4b6969a89e6f015276ed127cf5994e60e0203e78975702b78b5b8e07d603"}
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.671802 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-sczt2"
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.817577 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovncontroller-config-0\") pod \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") "
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.818035 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ssh-key\") pod \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") "
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.818110 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-inventory\") pod \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") "
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.818212 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2259\" (UniqueName: \"kubernetes.io/projected/8d6feded-06a9-476b-8ce0-9856c8ac5de2-kube-api-access-c2259\") pod \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") "
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.818284 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovn-combined-ca-bundle\") pod \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\" (UID: \"8d6feded-06a9-476b-8ce0-9856c8ac5de2\") "
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.824602 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8d6feded-06a9-476b-8ce0-9856c8ac5de2" (UID: "8d6feded-06a9-476b-8ce0-9856c8ac5de2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.824926 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6feded-06a9-476b-8ce0-9856c8ac5de2-kube-api-access-c2259" (OuterVolumeSpecName: "kube-api-access-c2259") pod "8d6feded-06a9-476b-8ce0-9856c8ac5de2" (UID: "8d6feded-06a9-476b-8ce0-9856c8ac5de2"). InnerVolumeSpecName "kube-api-access-c2259". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.847378 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8d6feded-06a9-476b-8ce0-9856c8ac5de2" (UID: "8d6feded-06a9-476b-8ce0-9856c8ac5de2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.848535 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-inventory" (OuterVolumeSpecName: "inventory") pod "8d6feded-06a9-476b-8ce0-9856c8ac5de2" (UID: "8d6feded-06a9-476b-8ce0-9856c8ac5de2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.855264 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8d6feded-06a9-476b-8ce0-9856c8ac5de2" (UID: "8d6feded-06a9-476b-8ce0-9856c8ac5de2"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.923221 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2259\" (UniqueName: \"kubernetes.io/projected/8d6feded-06a9-476b-8ce0-9856c8ac5de2-kube-api-access-c2259\") on node \"crc\" DevicePath \"\""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.923273 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.923290 4732 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.923302 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 10 08:56:13 crc kubenswrapper[4732]: I1010 08:56:13.923317 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d6feded-06a9-476b-8ce0-9856c8ac5de2-inventory\") on node \"crc\" DevicePath \"\""
Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.247298 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-sczt2" event={"ID":"8d6feded-06a9-476b-8ce0-9856c8ac5de2","Type":"ContainerDied","Data":"9b5fb48c3f503f550e96f967b124d948c4bff5d8cddd903016393fd38c26d78e"}
Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.247342 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b5fb48c3f503f550e96f967b124d948c4bff5d8cddd903016393fd38c26d78e"
Oct 10
08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.247761 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-sczt2" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.371608 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jf2wj"] Oct 10 08:56:14 crc kubenswrapper[4732]: E1010 08:56:14.381431 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="registry-server" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.381588 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="registry-server" Oct 10 08:56:14 crc kubenswrapper[4732]: E1010 08:56:14.381768 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="extract-content" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.381834 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="extract-content" Oct 10 08:56:14 crc kubenswrapper[4732]: E1010 08:56:14.381923 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6feded-06a9-476b-8ce0-9856c8ac5de2" containerName="ovn-openstack-openstack-cell1" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.381983 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6feded-06a9-476b-8ce0-9856c8ac5de2" containerName="ovn-openstack-openstack-cell1" Oct 10 08:56:14 crc kubenswrapper[4732]: E1010 08:56:14.382078 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="extract-utilities" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.382151 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="extract-utilities" Oct 10 
08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.382603 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82decbb-6535-44b0-9668-7cb9d4ad5edc" containerName="registry-server" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.382725 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6feded-06a9-476b-8ce0-9856c8ac5de2" containerName="ovn-openstack-openstack-cell1" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.384180 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.388591 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jf2wj"] Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.402304 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.402481 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.402969 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.402808 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.403175 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.402911 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.535922 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.536009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.536039 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r7vl\" (UniqueName: \"kubernetes.io/projected/015e3b36-7470-428f-8e34-37a633345b2e-kube-api-access-5r7vl\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.536103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.536122 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.536137 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.637207 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.637355 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.637414 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r7vl\" (UniqueName: \"kubernetes.io/projected/015e3b36-7470-428f-8e34-37a633345b2e-kube-api-access-5r7vl\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.637551 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.637606 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.637643 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.641628 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.641977 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.645808 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.647667 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.648089 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.670156 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r7vl\" (UniqueName: \"kubernetes.io/projected/015e3b36-7470-428f-8e34-37a633345b2e-kube-api-access-5r7vl\") pod \"neutron-metadata-openstack-openstack-cell1-jf2wj\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:14 crc kubenswrapper[4732]: I1010 08:56:14.730459 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:56:15 crc kubenswrapper[4732]: I1010 08:56:15.102625 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-jf2wj"] Oct 10 08:56:15 crc kubenswrapper[4732]: I1010 08:56:15.110783 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 08:56:15 crc kubenswrapper[4732]: I1010 08:56:15.260008 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" event={"ID":"015e3b36-7470-428f-8e34-37a633345b2e","Type":"ContainerStarted","Data":"870eedde7be62dd9baa094d5fb4ec964bf3de16b0b7a3c7f216aceb236631deb"} Oct 10 08:56:17 crc kubenswrapper[4732]: I1010 08:56:17.288240 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" event={"ID":"015e3b36-7470-428f-8e34-37a633345b2e","Type":"ContainerStarted","Data":"9e553b630109aa02a78909bbd8dcf7172f670edc5107165127061b2c15225c76"} Oct 10 08:56:17 crc kubenswrapper[4732]: I1010 08:56:17.321610 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" podStartSLOduration=2.229212789 podStartE2EDuration="3.321584465s" podCreationTimestamp="2025-10-10 08:56:14 +0000 UTC" firstStartedPulling="2025-10-10 08:56:15.110573735 +0000 UTC m=+7502.180164976" lastFinishedPulling="2025-10-10 08:56:16.202945401 +0000 UTC m=+7503.272536652" observedRunningTime="2025-10-10 08:56:17.31360903 +0000 UTC m=+7504.383200301" watchObservedRunningTime="2025-10-10 08:56:17.321584465 +0000 UTC m=+7504.391175726" Oct 10 08:56:25 crc kubenswrapper[4732]: I1010 08:56:25.356209 4732 
patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:56:25 crc kubenswrapper[4732]: I1010 08:56:25.356764 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:56:25 crc kubenswrapper[4732]: I1010 08:56:25.356810 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:56:25 crc kubenswrapper[4732]: I1010 08:56:25.357576 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0337e2dd4003adeeedb922754f25d102fb276bba03a83e971b1f685a707e8c87"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:56:25 crc kubenswrapper[4732]: I1010 08:56:25.357620 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://0337e2dd4003adeeedb922754f25d102fb276bba03a83e971b1f685a707e8c87" gracePeriod=600 Oct 10 08:56:26 crc kubenswrapper[4732]: I1010 08:56:26.404984 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="0337e2dd4003adeeedb922754f25d102fb276bba03a83e971b1f685a707e8c87" exitCode=0 Oct 10 08:56:26 crc 
kubenswrapper[4732]: I1010 08:56:26.405060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"0337e2dd4003adeeedb922754f25d102fb276bba03a83e971b1f685a707e8c87"} Oct 10 08:56:26 crc kubenswrapper[4732]: I1010 08:56:26.405248 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334"} Oct 10 08:56:26 crc kubenswrapper[4732]: I1010 08:56:26.405271 4732 scope.go:117] "RemoveContainer" containerID="85979f32dba6d404122810f1d60c55a1ecb4cbb48ed9390666c3976744b6b29b" Oct 10 08:57:12 crc kubenswrapper[4732]: I1010 08:57:12.903179 4732 generic.go:334] "Generic (PLEG): container finished" podID="015e3b36-7470-428f-8e34-37a633345b2e" containerID="9e553b630109aa02a78909bbd8dcf7172f670edc5107165127061b2c15225c76" exitCode=0 Oct 10 08:57:12 crc kubenswrapper[4732]: I1010 08:57:12.903911 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" event={"ID":"015e3b36-7470-428f-8e34-37a633345b2e","Type":"ContainerDied","Data":"9e553b630109aa02a78909bbd8dcf7172f670edc5107165127061b2c15225c76"} Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.421286 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.548473 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-ssh-key\") pod \"015e3b36-7470-428f-8e34-37a633345b2e\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.548652 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r7vl\" (UniqueName: \"kubernetes.io/projected/015e3b36-7470-428f-8e34-37a633345b2e-kube-api-access-5r7vl\") pod \"015e3b36-7470-428f-8e34-37a633345b2e\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.548772 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-metadata-combined-ca-bundle\") pod \"015e3b36-7470-428f-8e34-37a633345b2e\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.548994 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-inventory\") pod \"015e3b36-7470-428f-8e34-37a633345b2e\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.549175 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-nova-metadata-neutron-config-0\") pod \"015e3b36-7470-428f-8e34-37a633345b2e\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.549315 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"015e3b36-7470-428f-8e34-37a633345b2e\" (UID: \"015e3b36-7470-428f-8e34-37a633345b2e\") " Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.567092 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "015e3b36-7470-428f-8e34-37a633345b2e" (UID: "015e3b36-7470-428f-8e34-37a633345b2e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.567114 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015e3b36-7470-428f-8e34-37a633345b2e-kube-api-access-5r7vl" (OuterVolumeSpecName: "kube-api-access-5r7vl") pod "015e3b36-7470-428f-8e34-37a633345b2e" (UID: "015e3b36-7470-428f-8e34-37a633345b2e"). InnerVolumeSpecName "kube-api-access-5r7vl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.588093 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "015e3b36-7470-428f-8e34-37a633345b2e" (UID: "015e3b36-7470-428f-8e34-37a633345b2e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.594684 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-inventory" (OuterVolumeSpecName: "inventory") pod "015e3b36-7470-428f-8e34-37a633345b2e" (UID: "015e3b36-7470-428f-8e34-37a633345b2e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.611838 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "015e3b36-7470-428f-8e34-37a633345b2e" (UID: "015e3b36-7470-428f-8e34-37a633345b2e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.622187 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "015e3b36-7470-428f-8e34-37a633345b2e" (UID: "015e3b36-7470-428f-8e34-37a633345b2e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.651604 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.651656 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.651675 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r7vl\" (UniqueName: \"kubernetes.io/projected/015e3b36-7470-428f-8e34-37a633345b2e-kube-api-access-5r7vl\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.651725 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.651744 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.651761 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/015e3b36-7470-428f-8e34-37a633345b2e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.925812 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" 
event={"ID":"015e3b36-7470-428f-8e34-37a633345b2e","Type":"ContainerDied","Data":"870eedde7be62dd9baa094d5fb4ec964bf3de16b0b7a3c7f216aceb236631deb"} Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.926059 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="870eedde7be62dd9baa094d5fb4ec964bf3de16b0b7a3c7f216aceb236631deb" Oct 10 08:57:14 crc kubenswrapper[4732]: I1010 08:57:14.925947 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-jf2wj" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.069849 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-vqcwk"] Oct 10 08:57:15 crc kubenswrapper[4732]: E1010 08:57:15.070251 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015e3b36-7470-428f-8e34-37a633345b2e" containerName="neutron-metadata-openstack-openstack-cell1" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.070268 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="015e3b36-7470-428f-8e34-37a633345b2e" containerName="neutron-metadata-openstack-openstack-cell1" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.070474 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="015e3b36-7470-428f-8e34-37a633345b2e" containerName="neutron-metadata-openstack-openstack-cell1" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.071181 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.075179 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.075317 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.075320 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.075392 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.075521 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.089978 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-vqcwk"] Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.264670 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-ssh-key\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.264733 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: 
I1010 08:57:15.265035 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-inventory\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.265102 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjfr\" (UniqueName: \"kubernetes.io/projected/e0f22220-e678-461f-b4bf-fd1c0415a490-kube-api-access-dtjfr\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.265273 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.367206 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-inventory\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.367312 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjfr\" (UniqueName: \"kubernetes.io/projected/e0f22220-e678-461f-b4bf-fd1c0415a490-kube-api-access-dtjfr\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: 
\"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.367358 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.367423 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-ssh-key\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.367442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.372581 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-inventory\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.373103 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-secret-0\") pod 
\"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.376281 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.383794 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-ssh-key\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.390886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjfr\" (UniqueName: \"kubernetes.io/projected/e0f22220-e678-461f-b4bf-fd1c0415a490-kube-api-access-dtjfr\") pod \"libvirt-openstack-openstack-cell1-vqcwk\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:15 crc kubenswrapper[4732]: I1010 08:57:15.687371 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 08:57:18 crc kubenswrapper[4732]: I1010 08:57:18.657922 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-vqcwk"] Oct 10 08:57:18 crc kubenswrapper[4732]: I1010 08:57:18.969772 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" event={"ID":"e0f22220-e678-461f-b4bf-fd1c0415a490","Type":"ContainerStarted","Data":"1d5ebc2ca852987dbdb5a082da16ae94ab7a110b3e4a24d809ef1657e85e0996"} Oct 10 08:57:20 crc kubenswrapper[4732]: I1010 08:57:20.994106 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" event={"ID":"e0f22220-e678-461f-b4bf-fd1c0415a490","Type":"ContainerStarted","Data":"693a87a47a8ae3d5a881bdea67c72db5b828835e977b2f20bed8a2f1ed1074ad"} Oct 10 08:57:21 crc kubenswrapper[4732]: I1010 08:57:21.030500 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" podStartSLOduration=4.4141222540000005 podStartE2EDuration="6.030480124s" podCreationTimestamp="2025-10-10 08:57:15 +0000 UTC" firstStartedPulling="2025-10-10 08:57:18.663133064 +0000 UTC m=+7565.732724305" lastFinishedPulling="2025-10-10 08:57:20.279490934 +0000 UTC m=+7567.349082175" observedRunningTime="2025-10-10 08:57:21.022397026 +0000 UTC m=+7568.091988267" watchObservedRunningTime="2025-10-10 08:57:21.030480124 +0000 UTC m=+7568.100071365" Oct 10 08:58:25 crc kubenswrapper[4732]: I1010 08:58:25.355799 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:58:25 crc kubenswrapper[4732]: I1010 08:58:25.356535 4732 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.020399 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-99sxz"] Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.023256 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.102913 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99sxz"] Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.185589 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc49c\" (UniqueName: \"kubernetes.io/projected/ac5f8834-7026-409f-bc26-78000ce86327-kube-api-access-rc49c\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.185656 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-catalog-content\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.185745 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-utilities\") pod \"redhat-operators-99sxz\" (UID: 
\"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.287916 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-utilities\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.288077 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc49c\" (UniqueName: \"kubernetes.io/projected/ac5f8834-7026-409f-bc26-78000ce86327-kube-api-access-rc49c\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.288106 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-catalog-content\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.288653 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-catalog-content\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.288883 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-utilities\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " 
pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.310779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc49c\" (UniqueName: \"kubernetes.io/projected/ac5f8834-7026-409f-bc26-78000ce86327-kube-api-access-rc49c\") pod \"redhat-operators-99sxz\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.356214 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.356290 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.362823 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:58:55 crc kubenswrapper[4732]: I1010 08:58:55.902287 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-99sxz"] Oct 10 08:58:56 crc kubenswrapper[4732]: I1010 08:58:56.086399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99sxz" event={"ID":"ac5f8834-7026-409f-bc26-78000ce86327","Type":"ContainerStarted","Data":"374ed13a386ece798cf95a5da4f507fcc82d8a9c4b1499b500697be2bc876506"} Oct 10 08:58:57 crc kubenswrapper[4732]: I1010 08:58:57.097929 4732 generic.go:334] "Generic (PLEG): container finished" podID="ac5f8834-7026-409f-bc26-78000ce86327" containerID="4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602" exitCode=0 Oct 10 08:58:57 crc kubenswrapper[4732]: I1010 08:58:57.097982 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99sxz" event={"ID":"ac5f8834-7026-409f-bc26-78000ce86327","Type":"ContainerDied","Data":"4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602"} Oct 10 08:58:59 crc kubenswrapper[4732]: I1010 08:58:59.140574 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99sxz" event={"ID":"ac5f8834-7026-409f-bc26-78000ce86327","Type":"ContainerStarted","Data":"6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1"} Oct 10 08:59:03 crc kubenswrapper[4732]: I1010 08:59:03.198838 4732 generic.go:334] "Generic (PLEG): container finished" podID="ac5f8834-7026-409f-bc26-78000ce86327" containerID="6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1" exitCode=0 Oct 10 08:59:03 crc kubenswrapper[4732]: I1010 08:59:03.198978 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99sxz" 
event={"ID":"ac5f8834-7026-409f-bc26-78000ce86327","Type":"ContainerDied","Data":"6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1"} Oct 10 08:59:04 crc kubenswrapper[4732]: I1010 08:59:04.211255 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99sxz" event={"ID":"ac5f8834-7026-409f-bc26-78000ce86327","Type":"ContainerStarted","Data":"e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2"} Oct 10 08:59:04 crc kubenswrapper[4732]: I1010 08:59:04.233752 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-99sxz" podStartSLOduration=3.69407076 podStartE2EDuration="10.23373464s" podCreationTimestamp="2025-10-10 08:58:54 +0000 UTC" firstStartedPulling="2025-10-10 08:58:57.102359888 +0000 UTC m=+7664.171951129" lastFinishedPulling="2025-10-10 08:59:03.642023768 +0000 UTC m=+7670.711615009" observedRunningTime="2025-10-10 08:59:04.228843188 +0000 UTC m=+7671.298434429" watchObservedRunningTime="2025-10-10 08:59:04.23373464 +0000 UTC m=+7671.303325881" Oct 10 08:59:05 crc kubenswrapper[4732]: I1010 08:59:05.364070 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:59:05 crc kubenswrapper[4732]: I1010 08:59:05.364427 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:59:06 crc kubenswrapper[4732]: I1010 08:59:06.420980 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-99sxz" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="registry-server" probeResult="failure" output=< Oct 10 08:59:06 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 08:59:06 crc kubenswrapper[4732]: > Oct 10 08:59:16 crc kubenswrapper[4732]: I1010 08:59:16.425959 4732 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-99sxz" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="registry-server" probeResult="failure" output=< Oct 10 08:59:16 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 08:59:16 crc kubenswrapper[4732]: > Oct 10 08:59:25 crc kubenswrapper[4732]: I1010 08:59:25.356048 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 08:59:25 crc kubenswrapper[4732]: I1010 08:59:25.356552 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 08:59:25 crc kubenswrapper[4732]: I1010 08:59:25.356601 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 08:59:25 crc kubenswrapper[4732]: I1010 08:59:25.357111 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 08:59:25 crc kubenswrapper[4732]: I1010 08:59:25.357151 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" 
containerID="cri-o://52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" gracePeriod=600 Oct 10 08:59:25 crc kubenswrapper[4732]: I1010 08:59:25.441853 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:59:25 crc kubenswrapper[4732]: E1010 08:59:25.493369 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:59:25 crc kubenswrapper[4732]: I1010 08:59:25.500484 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:59:26 crc kubenswrapper[4732]: I1010 08:59:26.230472 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99sxz"] Oct 10 08:59:26 crc kubenswrapper[4732]: I1010 08:59:26.471067 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" exitCode=0 Oct 10 08:59:26 crc kubenswrapper[4732]: I1010 08:59:26.471136 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334"} Oct 10 08:59:26 crc kubenswrapper[4732]: I1010 08:59:26.471198 4732 scope.go:117] "RemoveContainer" containerID="0337e2dd4003adeeedb922754f25d102fb276bba03a83e971b1f685a707e8c87" Oct 10 08:59:26 crc kubenswrapper[4732]: I1010 08:59:26.472405 4732 scope.go:117] 
"RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 08:59:26 crc kubenswrapper[4732]: E1010 08:59:26.472943 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:59:27 crc kubenswrapper[4732]: I1010 08:59:27.495611 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-99sxz" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="registry-server" containerID="cri-o://e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2" gracePeriod=2 Oct 10 08:59:27 crc kubenswrapper[4732]: E1010 08:59:27.778941 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5f8834_7026_409f_bc26_78000ce86327.slice/crio-e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac5f8834_7026_409f_bc26_78000ce86327.slice/crio-conmon-e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2.scope\": RecentStats: unable to find data in memory cache]" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.007605 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.123319 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-catalog-content\") pod \"ac5f8834-7026-409f-bc26-78000ce86327\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.123425 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc49c\" (UniqueName: \"kubernetes.io/projected/ac5f8834-7026-409f-bc26-78000ce86327-kube-api-access-rc49c\") pod \"ac5f8834-7026-409f-bc26-78000ce86327\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.123499 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-utilities\") pod \"ac5f8834-7026-409f-bc26-78000ce86327\" (UID: \"ac5f8834-7026-409f-bc26-78000ce86327\") " Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.125031 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-utilities" (OuterVolumeSpecName: "utilities") pod "ac5f8834-7026-409f-bc26-78000ce86327" (UID: "ac5f8834-7026-409f-bc26-78000ce86327"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.129129 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac5f8834-7026-409f-bc26-78000ce86327-kube-api-access-rc49c" (OuterVolumeSpecName: "kube-api-access-rc49c") pod "ac5f8834-7026-409f-bc26-78000ce86327" (UID: "ac5f8834-7026-409f-bc26-78000ce86327"). InnerVolumeSpecName "kube-api-access-rc49c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.213599 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac5f8834-7026-409f-bc26-78000ce86327" (UID: "ac5f8834-7026-409f-bc26-78000ce86327"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.226107 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.226265 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc49c\" (UniqueName: \"kubernetes.io/projected/ac5f8834-7026-409f-bc26-78000ce86327-kube-api-access-rc49c\") on node \"crc\" DevicePath \"\"" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.226338 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac5f8834-7026-409f-bc26-78000ce86327-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.507156 4732 generic.go:334] "Generic (PLEG): container finished" podID="ac5f8834-7026-409f-bc26-78000ce86327" containerID="e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2" exitCode=0 Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.507200 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-99sxz" event={"ID":"ac5f8834-7026-409f-bc26-78000ce86327","Type":"ContainerDied","Data":"e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2"} Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.507233 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-99sxz" event={"ID":"ac5f8834-7026-409f-bc26-78000ce86327","Type":"ContainerDied","Data":"374ed13a386ece798cf95a5da4f507fcc82d8a9c4b1499b500697be2bc876506"} Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.507249 4732 scope.go:117] "RemoveContainer" containerID="e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.507244 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-99sxz" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.542850 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-99sxz"] Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.550837 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-99sxz"] Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.551129 4732 scope.go:117] "RemoveContainer" containerID="6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.587565 4732 scope.go:117] "RemoveContainer" containerID="4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.623809 4732 scope.go:117] "RemoveContainer" containerID="e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2" Oct 10 08:59:28 crc kubenswrapper[4732]: E1010 08:59:28.624287 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2\": container with ID starting with e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2 not found: ID does not exist" containerID="e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.624342 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2"} err="failed to get container status \"e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2\": rpc error: code = NotFound desc = could not find container \"e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2\": container with ID starting with e4b43149b9c8dbbb1ced3945db3c8651d6ea88145e6e63318043d8dd4c5ec7d2 not found: ID does not exist" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.624374 4732 scope.go:117] "RemoveContainer" containerID="6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1" Oct 10 08:59:28 crc kubenswrapper[4732]: E1010 08:59:28.624739 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1\": container with ID starting with 6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1 not found: ID does not exist" containerID="6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.624798 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1"} err="failed to get container status \"6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1\": rpc error: code = NotFound desc = could not find container \"6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1\": container with ID starting with 6f40cac8ec7d85cfa9efe1d03575b0f6f65326903c390e6c714df1fc39bb5ac1 not found: ID does not exist" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.624836 4732 scope.go:117] "RemoveContainer" containerID="4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602" Oct 10 08:59:28 crc kubenswrapper[4732]: E1010 
08:59:28.625150 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602\": container with ID starting with 4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602 not found: ID does not exist" containerID="4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602" Oct 10 08:59:28 crc kubenswrapper[4732]: I1010 08:59:28.625193 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602"} err="failed to get container status \"4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602\": rpc error: code = NotFound desc = could not find container \"4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602\": container with ID starting with 4ee3e33f02cfbb2a978ca00f0fdef9718852fda6830ee28a6674e208acea2602 not found: ID does not exist" Oct 10 08:59:29 crc kubenswrapper[4732]: I1010 08:59:29.677110 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac5f8834-7026-409f-bc26-78000ce86327" path="/var/lib/kubelet/pods/ac5f8834-7026-409f-bc26-78000ce86327/volumes" Oct 10 08:59:39 crc kubenswrapper[4732]: I1010 08:59:39.661040 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 08:59:39 crc kubenswrapper[4732]: E1010 08:59:39.662321 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 08:59:50 crc kubenswrapper[4732]: I1010 08:59:50.660535 
4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 08:59:50 crc kubenswrapper[4732]: E1010 08:59:50.661196 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.182791 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt"] Oct 10 09:00:00 crc kubenswrapper[4732]: E1010 09:00:00.185254 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="registry-server" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.185367 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="registry-server" Oct 10 09:00:00 crc kubenswrapper[4732]: E1010 09:00:00.185459 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="extract-utilities" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.185534 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="extract-utilities" Oct 10 09:00:00 crc kubenswrapper[4732]: E1010 09:00:00.185611 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="extract-content" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.185680 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="extract-content" Oct 
10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.186120 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac5f8834-7026-409f-bc26-78000ce86327" containerName="registry-server" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.187258 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.190995 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.197081 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.229418 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt"] Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.298972 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/324b63ee-f5a8-4cde-863f-450ffae67192-config-volume\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.299098 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/324b63ee-f5a8-4cde-863f-450ffae67192-secret-volume\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.299280 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mgw\" (UniqueName: \"kubernetes.io/projected/324b63ee-f5a8-4cde-863f-450ffae67192-kube-api-access-m7mgw\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.401319 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/324b63ee-f5a8-4cde-863f-450ffae67192-secret-volume\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.401442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mgw\" (UniqueName: \"kubernetes.io/projected/324b63ee-f5a8-4cde-863f-450ffae67192-kube-api-access-m7mgw\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.401516 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/324b63ee-f5a8-4cde-863f-450ffae67192-config-volume\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.402345 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/324b63ee-f5a8-4cde-863f-450ffae67192-config-volume\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.406803 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/324b63ee-f5a8-4cde-863f-450ffae67192-secret-volume\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.420423 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mgw\" (UniqueName: \"kubernetes.io/projected/324b63ee-f5a8-4cde-863f-450ffae67192-kube-api-access-m7mgw\") pod \"collect-profiles-29334780-rtgxt\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:00 crc kubenswrapper[4732]: I1010 09:00:00.537157 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:01 crc kubenswrapper[4732]: I1010 09:00:01.040119 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt"] Oct 10 09:00:01 crc kubenswrapper[4732]: I1010 09:00:01.917552 4732 generic.go:334] "Generic (PLEG): container finished" podID="324b63ee-f5a8-4cde-863f-450ffae67192" containerID="b4683358975482982cf85e245aa1a4e3853cfa0ac3fa093cf226fb5881dd46c7" exitCode=0 Oct 10 09:00:01 crc kubenswrapper[4732]: I1010 09:00:01.917666 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" event={"ID":"324b63ee-f5a8-4cde-863f-450ffae67192","Type":"ContainerDied","Data":"b4683358975482982cf85e245aa1a4e3853cfa0ac3fa093cf226fb5881dd46c7"} Oct 10 09:00:01 crc kubenswrapper[4732]: I1010 09:00:01.917947 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" event={"ID":"324b63ee-f5a8-4cde-863f-450ffae67192","Type":"ContainerStarted","Data":"40d8f4128bcf23e606a924cb95148c28c069cc8590d13750cd92c0b383e6aaa8"} Oct 10 09:00:02 crc kubenswrapper[4732]: I1010 09:00:02.660814 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:00:02 crc kubenswrapper[4732]: E1010 09:00:02.661398 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.341843 4732 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.470809 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/324b63ee-f5a8-4cde-863f-450ffae67192-secret-volume\") pod \"324b63ee-f5a8-4cde-863f-450ffae67192\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.470879 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7mgw\" (UniqueName: \"kubernetes.io/projected/324b63ee-f5a8-4cde-863f-450ffae67192-kube-api-access-m7mgw\") pod \"324b63ee-f5a8-4cde-863f-450ffae67192\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.470988 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/324b63ee-f5a8-4cde-863f-450ffae67192-config-volume\") pod \"324b63ee-f5a8-4cde-863f-450ffae67192\" (UID: \"324b63ee-f5a8-4cde-863f-450ffae67192\") " Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.471872 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/324b63ee-f5a8-4cde-863f-450ffae67192-config-volume" (OuterVolumeSpecName: "config-volume") pod "324b63ee-f5a8-4cde-863f-450ffae67192" (UID: "324b63ee-f5a8-4cde-863f-450ffae67192"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.476490 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/324b63ee-f5a8-4cde-863f-450ffae67192-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "324b63ee-f5a8-4cde-863f-450ffae67192" (UID: "324b63ee-f5a8-4cde-863f-450ffae67192"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.477235 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324b63ee-f5a8-4cde-863f-450ffae67192-kube-api-access-m7mgw" (OuterVolumeSpecName: "kube-api-access-m7mgw") pod "324b63ee-f5a8-4cde-863f-450ffae67192" (UID: "324b63ee-f5a8-4cde-863f-450ffae67192"). InnerVolumeSpecName "kube-api-access-m7mgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.573347 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/324b63ee-f5a8-4cde-863f-450ffae67192-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.573383 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7mgw\" (UniqueName: \"kubernetes.io/projected/324b63ee-f5a8-4cde-863f-450ffae67192-kube-api-access-m7mgw\") on node \"crc\" DevicePath \"\"" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.573395 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/324b63ee-f5a8-4cde-863f-450ffae67192-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.945319 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" event={"ID":"324b63ee-f5a8-4cde-863f-450ffae67192","Type":"ContainerDied","Data":"40d8f4128bcf23e606a924cb95148c28c069cc8590d13750cd92c0b383e6aaa8"} Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.945362 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d8f4128bcf23e606a924cb95148c28c069cc8590d13750cd92c0b383e6aaa8" Oct 10 09:00:03 crc kubenswrapper[4732]: I1010 09:00:03.945412 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt" Oct 10 09:00:04 crc kubenswrapper[4732]: I1010 09:00:04.445592 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7"] Oct 10 09:00:04 crc kubenswrapper[4732]: I1010 09:00:04.454988 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334735-hnhr7"] Oct 10 09:00:05 crc kubenswrapper[4732]: I1010 09:00:05.679200 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c" path="/var/lib/kubelet/pods/49e9ead5-b4dc-4ecf-97fb-c190a8f79d0c/volumes" Oct 10 09:00:14 crc kubenswrapper[4732]: I1010 09:00:14.660883 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:00:14 crc kubenswrapper[4732]: E1010 09:00:14.662402 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:00:29 crc kubenswrapper[4732]: I1010 09:00:29.660669 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:00:29 crc kubenswrapper[4732]: E1010 09:00:29.661459 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:00:38 crc kubenswrapper[4732]: I1010 09:00:38.188781 4732 scope.go:117] "RemoveContainer" containerID="17ecf4b8ed3713739ca10fb0b787ff3b55b05ee3b2855b7d90b4ff2b25f53c9b" Oct 10 09:00:43 crc kubenswrapper[4732]: I1010 09:00:43.670311 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:00:43 crc kubenswrapper[4732]: E1010 09:00:43.672192 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:00:55 crc kubenswrapper[4732]: I1010 09:00:55.661652 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:00:55 crc kubenswrapper[4732]: E1010 09:00:55.663073 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.151026 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29334781-sb8rl"] Oct 10 09:01:00 crc kubenswrapper[4732]: E1010 09:01:00.151956 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="324b63ee-f5a8-4cde-863f-450ffae67192" containerName="collect-profiles" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.151970 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="324b63ee-f5a8-4cde-863f-450ffae67192" containerName="collect-profiles" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.152188 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="324b63ee-f5a8-4cde-863f-450ffae67192" containerName="collect-profiles" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.153136 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.173227 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334781-sb8rl"] Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.213905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rljcc\" (UniqueName: \"kubernetes.io/projected/c4a4459f-b36d-49c9-9444-e1481c7d1087-kube-api-access-rljcc\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.214006 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-combined-ca-bundle\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.214041 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-fernet-keys\") pod 
\"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.214129 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-config-data\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.315666 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-config-data\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.315829 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rljcc\" (UniqueName: \"kubernetes.io/projected/c4a4459f-b36d-49c9-9444-e1481c7d1087-kube-api-access-rljcc\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.315891 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-combined-ca-bundle\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.315918 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-fernet-keys\") pod \"keystone-cron-29334781-sb8rl\" (UID: 
\"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.322558 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-combined-ca-bundle\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.322805 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-config-data\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.323234 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-fernet-keys\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.339104 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rljcc\" (UniqueName: \"kubernetes.io/projected/c4a4459f-b36d-49c9-9444-e1481c7d1087-kube-api-access-rljcc\") pod \"keystone-cron-29334781-sb8rl\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.475106 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:00 crc kubenswrapper[4732]: I1010 09:01:00.915447 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29334781-sb8rl"] Oct 10 09:01:01 crc kubenswrapper[4732]: I1010 09:01:01.617725 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-sb8rl" event={"ID":"c4a4459f-b36d-49c9-9444-e1481c7d1087","Type":"ContainerStarted","Data":"af33e56ed07ed63716dc5fb6bca2e2226bc8b8267ff3e1304b40d32050262404"} Oct 10 09:01:01 crc kubenswrapper[4732]: I1010 09:01:01.618098 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-sb8rl" event={"ID":"c4a4459f-b36d-49c9-9444-e1481c7d1087","Type":"ContainerStarted","Data":"fedb93e9cdab970eeef4c37c6dfe29f5f7da6d059cf31e8e80c09e65a888dc13"} Oct 10 09:01:01 crc kubenswrapper[4732]: I1010 09:01:01.641807 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29334781-sb8rl" podStartSLOduration=1.6417862159999999 podStartE2EDuration="1.641786216s" podCreationTimestamp="2025-10-10 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:01:01.635544616 +0000 UTC m=+7788.705135867" watchObservedRunningTime="2025-10-10 09:01:01.641786216 +0000 UTC m=+7788.711377447" Oct 10 09:01:06 crc kubenswrapper[4732]: I1010 09:01:06.661369 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:01:06 crc kubenswrapper[4732]: E1010 09:01:06.662102 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:01:09 crc kubenswrapper[4732]: I1010 09:01:09.717187 4732 generic.go:334] "Generic (PLEG): container finished" podID="c4a4459f-b36d-49c9-9444-e1481c7d1087" containerID="af33e56ed07ed63716dc5fb6bca2e2226bc8b8267ff3e1304b40d32050262404" exitCode=0 Oct 10 09:01:09 crc kubenswrapper[4732]: I1010 09:01:09.717398 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-sb8rl" event={"ID":"c4a4459f-b36d-49c9-9444-e1481c7d1087","Type":"ContainerDied","Data":"af33e56ed07ed63716dc5fb6bca2e2226bc8b8267ff3e1304b40d32050262404"} Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.080961 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.170637 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-fernet-keys\") pod \"c4a4459f-b36d-49c9-9444-e1481c7d1087\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.170825 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-combined-ca-bundle\") pod \"c4a4459f-b36d-49c9-9444-e1481c7d1087\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.170953 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rljcc\" (UniqueName: \"kubernetes.io/projected/c4a4459f-b36d-49c9-9444-e1481c7d1087-kube-api-access-rljcc\") pod \"c4a4459f-b36d-49c9-9444-e1481c7d1087\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 
09:01:11.171015 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-config-data\") pod \"c4a4459f-b36d-49c9-9444-e1481c7d1087\" (UID: \"c4a4459f-b36d-49c9-9444-e1481c7d1087\") " Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.178339 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a4459f-b36d-49c9-9444-e1481c7d1087-kube-api-access-rljcc" (OuterVolumeSpecName: "kube-api-access-rljcc") pod "c4a4459f-b36d-49c9-9444-e1481c7d1087" (UID: "c4a4459f-b36d-49c9-9444-e1481c7d1087"). InnerVolumeSpecName "kube-api-access-rljcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.190295 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c4a4459f-b36d-49c9-9444-e1481c7d1087" (UID: "c4a4459f-b36d-49c9-9444-e1481c7d1087"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.212878 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4a4459f-b36d-49c9-9444-e1481c7d1087" (UID: "c4a4459f-b36d-49c9-9444-e1481c7d1087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.233285 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-config-data" (OuterVolumeSpecName: "config-data") pod "c4a4459f-b36d-49c9-9444-e1481c7d1087" (UID: "c4a4459f-b36d-49c9-9444-e1481c7d1087"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.273594 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.273627 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rljcc\" (UniqueName: \"kubernetes.io/projected/c4a4459f-b36d-49c9-9444-e1481c7d1087-kube-api-access-rljcc\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.273641 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.273655 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c4a4459f-b36d-49c9-9444-e1481c7d1087-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.757021 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29334781-sb8rl" event={"ID":"c4a4459f-b36d-49c9-9444-e1481c7d1087","Type":"ContainerDied","Data":"fedb93e9cdab970eeef4c37c6dfe29f5f7da6d059cf31e8e80c09e65a888dc13"} Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.757062 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fedb93e9cdab970eeef4c37c6dfe29f5f7da6d059cf31e8e80c09e65a888dc13" Oct 10 09:01:11 crc kubenswrapper[4732]: I1010 09:01:11.757121 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29334781-sb8rl" Oct 10 09:01:18 crc kubenswrapper[4732]: I1010 09:01:18.661197 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:01:18 crc kubenswrapper[4732]: E1010 09:01:18.663854 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:01:32 crc kubenswrapper[4732]: I1010 09:01:32.660464 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:01:32 crc kubenswrapper[4732]: E1010 09:01:32.661087 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:01:44 crc kubenswrapper[4732]: I1010 09:01:44.665714 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:01:44 crc kubenswrapper[4732]: E1010 09:01:44.679863 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:01:58 crc kubenswrapper[4732]: I1010 09:01:58.660780 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:01:58 crc kubenswrapper[4732]: E1010 09:01:58.661589 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:02:05 crc kubenswrapper[4732]: I1010 09:02:05.322903 4732 generic.go:334] "Generic (PLEG): container finished" podID="e0f22220-e678-461f-b4bf-fd1c0415a490" containerID="693a87a47a8ae3d5a881bdea67c72db5b828835e977b2f20bed8a2f1ed1074ad" exitCode=0 Oct 10 09:02:05 crc kubenswrapper[4732]: I1010 09:02:05.323002 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" event={"ID":"e0f22220-e678-461f-b4bf-fd1c0415a490","Type":"ContainerDied","Data":"693a87a47a8ae3d5a881bdea67c72db5b828835e977b2f20bed8a2f1ed1074ad"} Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.801846 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.976198 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-secret-0\") pod \"e0f22220-e678-461f-b4bf-fd1c0415a490\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.976277 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-combined-ca-bundle\") pod \"e0f22220-e678-461f-b4bf-fd1c0415a490\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.976355 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-ssh-key\") pod \"e0f22220-e678-461f-b4bf-fd1c0415a490\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.976499 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtjfr\" (UniqueName: \"kubernetes.io/projected/e0f22220-e678-461f-b4bf-fd1c0415a490-kube-api-access-dtjfr\") pod \"e0f22220-e678-461f-b4bf-fd1c0415a490\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.976535 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-inventory\") pod \"e0f22220-e678-461f-b4bf-fd1c0415a490\" (UID: \"e0f22220-e678-461f-b4bf-fd1c0415a490\") " Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.984947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e0f22220-e678-461f-b4bf-fd1c0415a490" (UID: "e0f22220-e678-461f-b4bf-fd1c0415a490"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:02:06 crc kubenswrapper[4732]: I1010 09:02:06.989349 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f22220-e678-461f-b4bf-fd1c0415a490-kube-api-access-dtjfr" (OuterVolumeSpecName: "kube-api-access-dtjfr") pod "e0f22220-e678-461f-b4bf-fd1c0415a490" (UID: "e0f22220-e678-461f-b4bf-fd1c0415a490"). InnerVolumeSpecName "kube-api-access-dtjfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.007615 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e0f22220-e678-461f-b4bf-fd1c0415a490" (UID: "e0f22220-e678-461f-b4bf-fd1c0415a490"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.009683 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-inventory" (OuterVolumeSpecName: "inventory") pod "e0f22220-e678-461f-b4bf-fd1c0415a490" (UID: "e0f22220-e678-461f-b4bf-fd1c0415a490"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.014944 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e0f22220-e678-461f-b4bf-fd1c0415a490" (UID: "e0f22220-e678-461f-b4bf-fd1c0415a490"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.079267 4732 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.079301 4732 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.079313 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.079322 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtjfr\" (UniqueName: \"kubernetes.io/projected/e0f22220-e678-461f-b4bf-fd1c0415a490-kube-api-access-dtjfr\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.079331 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e0f22220-e678-461f-b4bf-fd1c0415a490-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.347468 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" event={"ID":"e0f22220-e678-461f-b4bf-fd1c0415a490","Type":"ContainerDied","Data":"1d5ebc2ca852987dbdb5a082da16ae94ab7a110b3e4a24d809ef1657e85e0996"} Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.347541 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5ebc2ca852987dbdb5a082da16ae94ab7a110b3e4a24d809ef1657e85e0996" Oct 10 09:02:07 crc 
kubenswrapper[4732]: I1010 09:02:07.347585 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vqcwk" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.447166 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-7j7kl"] Oct 10 09:02:07 crc kubenswrapper[4732]: E1010 09:02:07.447600 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f22220-e678-461f-b4bf-fd1c0415a490" containerName="libvirt-openstack-openstack-cell1" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.447621 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f22220-e678-461f-b4bf-fd1c0415a490" containerName="libvirt-openstack-openstack-cell1" Oct 10 09:02:07 crc kubenswrapper[4732]: E1010 09:02:07.447654 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a4459f-b36d-49c9-9444-e1481c7d1087" containerName="keystone-cron" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.447661 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a4459f-b36d-49c9-9444-e1481c7d1087" containerName="keystone-cron" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.447871 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a4459f-b36d-49c9-9444-e1481c7d1087" containerName="keystone-cron" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.447893 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f22220-e678-461f-b4bf-fd1c0415a490" containerName="libvirt-openstack-openstack-cell1" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.448634 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.451819 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.455821 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.455993 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.456505 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.456610 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.456736 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.456855 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.460910 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-7j7kl"] Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.488935 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg4dq\" (UniqueName: \"kubernetes.io/projected/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-kube-api-access-fg4dq\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489036 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489059 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489083 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489124 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489210 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489236 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-inventory\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489266 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.489284 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.590649 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-inventory\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: 
I1010 09:02:07.590736 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.590767 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.590809 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4dq\" (UniqueName: \"kubernetes.io/projected/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-kube-api-access-fg4dq\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.590897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.590922 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" 
(UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.590954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.591009 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.591068 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.592270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.595400 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.596170 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.596450 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.596896 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.597079 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 
crc kubenswrapper[4732]: I1010 09:02:07.597669 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-inventory\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.598361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.607655 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg4dq\" (UniqueName: \"kubernetes.io/projected/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-kube-api-access-fg4dq\") pod \"nova-cell1-openstack-openstack-cell1-7j7kl\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:07 crc kubenswrapper[4732]: I1010 09:02:07.769401 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:02:08 crc kubenswrapper[4732]: I1010 09:02:08.355493 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-7j7kl"] Oct 10 09:02:08 crc kubenswrapper[4732]: I1010 09:02:08.367894 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:02:09 crc kubenswrapper[4732]: I1010 09:02:09.372934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" event={"ID":"dfd6910b-38c8-4a2c-9095-16c82b02e3e0","Type":"ContainerStarted","Data":"1b32fb6eae6e75e2a932724f1ab4240b3995da67cc64613661f4a5497cbf2907"} Oct 10 09:02:09 crc kubenswrapper[4732]: I1010 09:02:09.373265 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" event={"ID":"dfd6910b-38c8-4a2c-9095-16c82b02e3e0","Type":"ContainerStarted","Data":"c198447485038b1f164647e6c9ad1a5ad0d0861cdd0887b5996a2f6cd9ad4d8f"} Oct 10 09:02:09 crc kubenswrapper[4732]: I1010 09:02:09.401177 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" podStartSLOduration=1.688366598 podStartE2EDuration="2.401155599s" podCreationTimestamp="2025-10-10 09:02:07 +0000 UTC" firstStartedPulling="2025-10-10 09:02:08.367620229 +0000 UTC m=+7855.437211470" lastFinishedPulling="2025-10-10 09:02:09.08040919 +0000 UTC m=+7856.150000471" observedRunningTime="2025-10-10 09:02:09.396392639 +0000 UTC m=+7856.465983880" watchObservedRunningTime="2025-10-10 09:02:09.401155599 +0000 UTC m=+7856.470746850" Oct 10 09:02:09 crc kubenswrapper[4732]: I1010 09:02:09.660484 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:02:09 crc kubenswrapper[4732]: E1010 09:02:09.660792 4732 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:02:23 crc kubenswrapper[4732]: I1010 09:02:23.673508 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:02:23 crc kubenswrapper[4732]: E1010 09:02:23.674376 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:02:34 crc kubenswrapper[4732]: I1010 09:02:34.661040 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:02:34 crc kubenswrapper[4732]: E1010 09:02:34.661912 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.571665 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bvpcf"] Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.574904 4732 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.582915 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvpcf"] Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.671164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-catalog-content\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.671315 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-utilities\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.671663 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsddc\" (UniqueName: \"kubernetes.io/projected/84935670-75d6-471d-9b23-a55456e553f8-kube-api-access-jsddc\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.773072 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsddc\" (UniqueName: \"kubernetes.io/projected/84935670-75d6-471d-9b23-a55456e553f8-kube-api-access-jsddc\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.773874 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-catalog-content\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.774542 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-utilities\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.774678 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-catalog-content\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.774982 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-utilities\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.795672 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsddc\" (UniqueName: \"kubernetes.io/projected/84935670-75d6-471d-9b23-a55456e553f8-kube-api-access-jsddc\") pod \"redhat-marketplace-bvpcf\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:42 crc kubenswrapper[4732]: I1010 09:02:42.905232 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:43 crc kubenswrapper[4732]: I1010 09:02:43.212684 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvpcf"] Oct 10 09:02:43 crc kubenswrapper[4732]: I1010 09:02:43.769748 4732 generic.go:334] "Generic (PLEG): container finished" podID="84935670-75d6-471d-9b23-a55456e553f8" containerID="35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e" exitCode=0 Oct 10 09:02:43 crc kubenswrapper[4732]: I1010 09:02:43.769815 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvpcf" event={"ID":"84935670-75d6-471d-9b23-a55456e553f8","Type":"ContainerDied","Data":"35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e"} Oct 10 09:02:43 crc kubenswrapper[4732]: I1010 09:02:43.769906 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvpcf" event={"ID":"84935670-75d6-471d-9b23-a55456e553f8","Type":"ContainerStarted","Data":"0ba3fc22791ff72b1828dd85003acd8af800d8cf95f941483e29bb7be8e914cd"} Oct 10 09:02:45 crc kubenswrapper[4732]: I1010 09:02:45.793066 4732 generic.go:334] "Generic (PLEG): container finished" podID="84935670-75d6-471d-9b23-a55456e553f8" containerID="6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50" exitCode=0 Oct 10 09:02:45 crc kubenswrapper[4732]: I1010 09:02:45.793159 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvpcf" event={"ID":"84935670-75d6-471d-9b23-a55456e553f8","Type":"ContainerDied","Data":"6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50"} Oct 10 09:02:46 crc kubenswrapper[4732]: I1010 09:02:46.809367 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvpcf" 
event={"ID":"84935670-75d6-471d-9b23-a55456e553f8","Type":"ContainerStarted","Data":"1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8"} Oct 10 09:02:46 crc kubenswrapper[4732]: I1010 09:02:46.839058 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bvpcf" podStartSLOduration=2.257755494 podStartE2EDuration="4.839037945s" podCreationTimestamp="2025-10-10 09:02:42 +0000 UTC" firstStartedPulling="2025-10-10 09:02:43.771999939 +0000 UTC m=+7890.841591180" lastFinishedPulling="2025-10-10 09:02:46.35328239 +0000 UTC m=+7893.422873631" observedRunningTime="2025-10-10 09:02:46.832952679 +0000 UTC m=+7893.902543950" watchObservedRunningTime="2025-10-10 09:02:46.839037945 +0000 UTC m=+7893.908629206" Oct 10 09:02:49 crc kubenswrapper[4732]: I1010 09:02:49.660020 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:02:49 crc kubenswrapper[4732]: E1010 09:02:49.660898 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:02:52 crc kubenswrapper[4732]: I1010 09:02:52.906899 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:52 crc kubenswrapper[4732]: I1010 09:02:52.907453 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:52 crc kubenswrapper[4732]: I1010 09:02:52.958995 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:53 crc kubenswrapper[4732]: I1010 09:02:53.935809 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:53 crc kubenswrapper[4732]: I1010 09:02:53.986331 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvpcf"] Oct 10 09:02:55 crc kubenswrapper[4732]: I1010 09:02:55.898772 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bvpcf" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="registry-server" containerID="cri-o://1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8" gracePeriod=2 Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.389204 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.482215 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsddc\" (UniqueName: \"kubernetes.io/projected/84935670-75d6-471d-9b23-a55456e553f8-kube-api-access-jsddc\") pod \"84935670-75d6-471d-9b23-a55456e553f8\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.482344 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-catalog-content\") pod \"84935670-75d6-471d-9b23-a55456e553f8\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.482457 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-utilities\") pod 
\"84935670-75d6-471d-9b23-a55456e553f8\" (UID: \"84935670-75d6-471d-9b23-a55456e553f8\") " Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.483745 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-utilities" (OuterVolumeSpecName: "utilities") pod "84935670-75d6-471d-9b23-a55456e553f8" (UID: "84935670-75d6-471d-9b23-a55456e553f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.488902 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84935670-75d6-471d-9b23-a55456e553f8-kube-api-access-jsddc" (OuterVolumeSpecName: "kube-api-access-jsddc") pod "84935670-75d6-471d-9b23-a55456e553f8" (UID: "84935670-75d6-471d-9b23-a55456e553f8"). InnerVolumeSpecName "kube-api-access-jsddc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.496510 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84935670-75d6-471d-9b23-a55456e553f8" (UID: "84935670-75d6-471d-9b23-a55456e553f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.584438 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.584482 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84935670-75d6-471d-9b23-a55456e553f8-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.584496 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsddc\" (UniqueName: \"kubernetes.io/projected/84935670-75d6-471d-9b23-a55456e553f8-kube-api-access-jsddc\") on node \"crc\" DevicePath \"\"" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.912017 4732 generic.go:334] "Generic (PLEG): container finished" podID="84935670-75d6-471d-9b23-a55456e553f8" containerID="1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8" exitCode=0 Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.912064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvpcf" event={"ID":"84935670-75d6-471d-9b23-a55456e553f8","Type":"ContainerDied","Data":"1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8"} Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.912396 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvpcf" event={"ID":"84935670-75d6-471d-9b23-a55456e553f8","Type":"ContainerDied","Data":"0ba3fc22791ff72b1828dd85003acd8af800d8cf95f941483e29bb7be8e914cd"} Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.912419 4732 scope.go:117] "RemoveContainer" containerID="1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 
09:02:56.912163 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvpcf" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.935743 4732 scope.go:117] "RemoveContainer" containerID="6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.958163 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvpcf"] Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.969488 4732 scope.go:117] "RemoveContainer" containerID="35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e" Oct 10 09:02:56 crc kubenswrapper[4732]: I1010 09:02:56.969893 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvpcf"] Oct 10 09:02:57 crc kubenswrapper[4732]: I1010 09:02:57.026220 4732 scope.go:117] "RemoveContainer" containerID="1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8" Oct 10 09:02:57 crc kubenswrapper[4732]: E1010 09:02:57.027104 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8\": container with ID starting with 1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8 not found: ID does not exist" containerID="1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8" Oct 10 09:02:57 crc kubenswrapper[4732]: I1010 09:02:57.027159 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8"} err="failed to get container status \"1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8\": rpc error: code = NotFound desc = could not find container \"1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8\": container with ID starting with 
1374214198eecf9e46719b59d7a76e982fc5a53d6b15098e41bf291e303b2ca8 not found: ID does not exist" Oct 10 09:02:57 crc kubenswrapper[4732]: I1010 09:02:57.027194 4732 scope.go:117] "RemoveContainer" containerID="6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50" Oct 10 09:02:57 crc kubenswrapper[4732]: E1010 09:02:57.027733 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50\": container with ID starting with 6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50 not found: ID does not exist" containerID="6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50" Oct 10 09:02:57 crc kubenswrapper[4732]: I1010 09:02:57.027769 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50"} err="failed to get container status \"6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50\": rpc error: code = NotFound desc = could not find container \"6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50\": container with ID starting with 6a93a813351c1a8ccb99913a1565559c1560218112111c8a8729480e4a451d50 not found: ID does not exist" Oct 10 09:02:57 crc kubenswrapper[4732]: I1010 09:02:57.027846 4732 scope.go:117] "RemoveContainer" containerID="35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e" Oct 10 09:02:57 crc kubenswrapper[4732]: E1010 09:02:57.028247 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e\": container with ID starting with 35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e not found: ID does not exist" containerID="35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e" Oct 10 09:02:57 crc 
kubenswrapper[4732]: I1010 09:02:57.028269 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e"} err="failed to get container status \"35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e\": rpc error: code = NotFound desc = could not find container \"35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e\": container with ID starting with 35f9261a6e51ac9054982aeb4c5e38c62446fb0c3987e8f60d13133549aa037e not found: ID does not exist" Oct 10 09:02:57 crc kubenswrapper[4732]: I1010 09:02:57.681440 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84935670-75d6-471d-9b23-a55456e553f8" path="/var/lib/kubelet/pods/84935670-75d6-471d-9b23-a55456e553f8/volumes" Oct 10 09:03:00 crc kubenswrapper[4732]: I1010 09:03:00.660679 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:03:00 crc kubenswrapper[4732]: E1010 09:03:00.661288 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:03:13 crc kubenswrapper[4732]: I1010 09:03:13.681199 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:03:13 crc kubenswrapper[4732]: E1010 09:03:13.682656 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:03:28 crc kubenswrapper[4732]: I1010 09:03:28.660558 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:03:28 crc kubenswrapper[4732]: E1010 09:03:28.661365 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:03:42 crc kubenswrapper[4732]: I1010 09:03:42.661179 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:03:42 crc kubenswrapper[4732]: E1010 09:03:42.662345 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:03:53 crc kubenswrapper[4732]: I1010 09:03:53.668406 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:03:53 crc kubenswrapper[4732]: E1010 09:03:53.669353 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:04:08 crc kubenswrapper[4732]: I1010 09:04:08.660543 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:04:08 crc kubenswrapper[4732]: E1010 09:04:08.661991 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:04:20 crc kubenswrapper[4732]: I1010 09:04:20.660850 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:04:20 crc kubenswrapper[4732]: E1010 09:04:20.661985 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:04:31 crc kubenswrapper[4732]: I1010 09:04:31.660971 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:04:32 crc kubenswrapper[4732]: I1010 09:04:32.969547 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"449ced8eaafb77645ddf4550bf80964ce8c9de6970eb9be8dfb5c0ba7f5d8379"} Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.827462 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4qqqg"] Oct 10 09:05:01 crc kubenswrapper[4732]: E1010 09:05:01.828739 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="extract-content" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.828753 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="extract-content" Oct 10 09:05:01 crc kubenswrapper[4732]: E1010 09:05:01.828787 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="registry-server" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.828793 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="registry-server" Oct 10 09:05:01 crc kubenswrapper[4732]: E1010 09:05:01.828821 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="extract-utilities" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.828829 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="extract-utilities" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.829145 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="84935670-75d6-471d-9b23-a55456e553f8" containerName="registry-server" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.840231 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4qqqg"] Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.840575 4732 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.992428 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmhb\" (UniqueName: \"kubernetes.io/projected/473ba42e-6745-40ef-b613-5065262041f6-kube-api-access-6hmhb\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.992529 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-utilities\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:01 crc kubenswrapper[4732]: I1010 09:05:01.992653 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-catalog-content\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.094347 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmhb\" (UniqueName: \"kubernetes.io/projected/473ba42e-6745-40ef-b613-5065262041f6-kube-api-access-6hmhb\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.094456 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-utilities\") pod 
\"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.094528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-catalog-content\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.094988 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-utilities\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.095125 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-catalog-content\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.137633 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmhb\" (UniqueName: \"kubernetes.io/projected/473ba42e-6745-40ef-b613-5065262041f6-kube-api-access-6hmhb\") pod \"community-operators-4qqqg\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.177227 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:02 crc kubenswrapper[4732]: I1010 09:05:02.666739 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4qqqg"] Oct 10 09:05:03 crc kubenswrapper[4732]: I1010 09:05:03.276869 4732 generic.go:334] "Generic (PLEG): container finished" podID="473ba42e-6745-40ef-b613-5065262041f6" containerID="76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb" exitCode=0 Oct 10 09:05:03 crc kubenswrapper[4732]: I1010 09:05:03.276929 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qqqg" event={"ID":"473ba42e-6745-40ef-b613-5065262041f6","Type":"ContainerDied","Data":"76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb"} Oct 10 09:05:03 crc kubenswrapper[4732]: I1010 09:05:03.277090 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qqqg" event={"ID":"473ba42e-6745-40ef-b613-5065262041f6","Type":"ContainerStarted","Data":"9a1ebf3421373c7dc0221aca1ff577e45c85cd7206d9fd633fe94d488d99b2f7"} Oct 10 09:05:04 crc kubenswrapper[4732]: I1010 09:05:04.290037 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qqqg" event={"ID":"473ba42e-6745-40ef-b613-5065262041f6","Type":"ContainerStarted","Data":"bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f"} Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.149733 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pkbpv"] Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.151721 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.170938 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pkbpv"] Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.282763 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-utilities\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.282818 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-catalog-content\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.282997 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbfjb\" (UniqueName: \"kubernetes.io/projected/8264df3b-c19f-45d2-84ac-9dd063dbacdd-kube-api-access-qbfjb\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.309660 4732 generic.go:334] "Generic (PLEG): container finished" podID="473ba42e-6745-40ef-b613-5065262041f6" containerID="bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f" exitCode=0 Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.309735 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qqqg" 
event={"ID":"473ba42e-6745-40ef-b613-5065262041f6","Type":"ContainerDied","Data":"bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f"} Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.385371 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-catalog-content\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.385743 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbfjb\" (UniqueName: \"kubernetes.io/projected/8264df3b-c19f-45d2-84ac-9dd063dbacdd-kube-api-access-qbfjb\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.385880 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-utilities\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.386290 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-utilities\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.386503 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-catalog-content\") pod 
\"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.405549 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbfjb\" (UniqueName: \"kubernetes.io/projected/8264df3b-c19f-45d2-84ac-9dd063dbacdd-kube-api-access-qbfjb\") pod \"certified-operators-pkbpv\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:06 crc kubenswrapper[4732]: I1010 09:05:06.490296 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:07 crc kubenswrapper[4732]: I1010 09:05:07.047657 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pkbpv"] Oct 10 09:05:07 crc kubenswrapper[4732]: I1010 09:05:07.323258 4732 generic.go:334] "Generic (PLEG): container finished" podID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerID="b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4" exitCode=0 Oct 10 09:05:07 crc kubenswrapper[4732]: I1010 09:05:07.323609 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkbpv" event={"ID":"8264df3b-c19f-45d2-84ac-9dd063dbacdd","Type":"ContainerDied","Data":"b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4"} Oct 10 09:05:07 crc kubenswrapper[4732]: I1010 09:05:07.323635 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkbpv" event={"ID":"8264df3b-c19f-45d2-84ac-9dd063dbacdd","Type":"ContainerStarted","Data":"1a28c9c0c90de432c33427298c0552e942586351c32adb6bfd0d952cc1980bb6"} Oct 10 09:05:07 crc kubenswrapper[4732]: I1010 09:05:07.328222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qqqg" 
event={"ID":"473ba42e-6745-40ef-b613-5065262041f6","Type":"ContainerStarted","Data":"8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd"} Oct 10 09:05:07 crc kubenswrapper[4732]: I1010 09:05:07.368839 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4qqqg" podStartSLOduration=2.880601533 podStartE2EDuration="6.368817346s" podCreationTimestamp="2025-10-10 09:05:01 +0000 UTC" firstStartedPulling="2025-10-10 09:05:03.278607261 +0000 UTC m=+8030.348198512" lastFinishedPulling="2025-10-10 09:05:06.766823084 +0000 UTC m=+8033.836414325" observedRunningTime="2025-10-10 09:05:07.365389243 +0000 UTC m=+8034.434980494" watchObservedRunningTime="2025-10-10 09:05:07.368817346 +0000 UTC m=+8034.438408607" Oct 10 09:05:08 crc kubenswrapper[4732]: I1010 09:05:08.344532 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkbpv" event={"ID":"8264df3b-c19f-45d2-84ac-9dd063dbacdd","Type":"ContainerStarted","Data":"03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28"} Oct 10 09:05:10 crc kubenswrapper[4732]: I1010 09:05:10.378076 4732 generic.go:334] "Generic (PLEG): container finished" podID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerID="03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28" exitCode=0 Oct 10 09:05:10 crc kubenswrapper[4732]: I1010 09:05:10.378350 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkbpv" event={"ID":"8264df3b-c19f-45d2-84ac-9dd063dbacdd","Type":"ContainerDied","Data":"03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28"} Oct 10 09:05:11 crc kubenswrapper[4732]: I1010 09:05:11.395332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkbpv" 
event={"ID":"8264df3b-c19f-45d2-84ac-9dd063dbacdd","Type":"ContainerStarted","Data":"ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87"} Oct 10 09:05:11 crc kubenswrapper[4732]: I1010 09:05:11.420838 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pkbpv" podStartSLOduration=1.814309825 podStartE2EDuration="5.42081682s" podCreationTimestamp="2025-10-10 09:05:06 +0000 UTC" firstStartedPulling="2025-10-10 09:05:07.326428181 +0000 UTC m=+8034.396019432" lastFinishedPulling="2025-10-10 09:05:10.932935196 +0000 UTC m=+8038.002526427" observedRunningTime="2025-10-10 09:05:11.416865642 +0000 UTC m=+8038.486456923" watchObservedRunningTime="2025-10-10 09:05:11.42081682 +0000 UTC m=+8038.490408061" Oct 10 09:05:12 crc kubenswrapper[4732]: I1010 09:05:12.177903 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:12 crc kubenswrapper[4732]: I1010 09:05:12.177966 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:13 crc kubenswrapper[4732]: I1010 09:05:13.255243 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4qqqg" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="registry-server" probeResult="failure" output=< Oct 10 09:05:13 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 09:05:13 crc kubenswrapper[4732]: > Oct 10 09:05:16 crc kubenswrapper[4732]: I1010 09:05:16.491072 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:16 crc kubenswrapper[4732]: I1010 09:05:16.491422 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:17 crc 
kubenswrapper[4732]: I1010 09:05:17.565300 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-pkbpv" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="registry-server" probeResult="failure" output=< Oct 10 09:05:17 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 09:05:17 crc kubenswrapper[4732]: > Oct 10 09:05:22 crc kubenswrapper[4732]: I1010 09:05:22.249523 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:22 crc kubenswrapper[4732]: I1010 09:05:22.325160 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:22 crc kubenswrapper[4732]: I1010 09:05:22.492038 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4qqqg"] Oct 10 09:05:23 crc kubenswrapper[4732]: I1010 09:05:23.526483 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4qqqg" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="registry-server" containerID="cri-o://8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd" gracePeriod=2 Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.132655 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.182127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hmhb\" (UniqueName: \"kubernetes.io/projected/473ba42e-6745-40ef-b613-5065262041f6-kube-api-access-6hmhb\") pod \"473ba42e-6745-40ef-b613-5065262041f6\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.182186 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-catalog-content\") pod \"473ba42e-6745-40ef-b613-5065262041f6\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.182261 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-utilities\") pod \"473ba42e-6745-40ef-b613-5065262041f6\" (UID: \"473ba42e-6745-40ef-b613-5065262041f6\") " Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.183634 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-utilities" (OuterVolumeSpecName: "utilities") pod "473ba42e-6745-40ef-b613-5065262041f6" (UID: "473ba42e-6745-40ef-b613-5065262041f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.190123 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473ba42e-6745-40ef-b613-5065262041f6-kube-api-access-6hmhb" (OuterVolumeSpecName: "kube-api-access-6hmhb") pod "473ba42e-6745-40ef-b613-5065262041f6" (UID: "473ba42e-6745-40ef-b613-5065262041f6"). InnerVolumeSpecName "kube-api-access-6hmhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.235058 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "473ba42e-6745-40ef-b613-5065262041f6" (UID: "473ba42e-6745-40ef-b613-5065262041f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.285016 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hmhb\" (UniqueName: \"kubernetes.io/projected/473ba42e-6745-40ef-b613-5065262041f6-kube-api-access-6hmhb\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.285047 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.285058 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/473ba42e-6745-40ef-b613-5065262041f6-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.541469 4732 generic.go:334] "Generic (PLEG): container finished" podID="473ba42e-6745-40ef-b613-5065262041f6" containerID="8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd" exitCode=0 Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.541532 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4qqqg" event={"ID":"473ba42e-6745-40ef-b613-5065262041f6","Type":"ContainerDied","Data":"8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd"} Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.541595 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4qqqg" event={"ID":"473ba42e-6745-40ef-b613-5065262041f6","Type":"ContainerDied","Data":"9a1ebf3421373c7dc0221aca1ff577e45c85cd7206d9fd633fe94d488d99b2f7"} Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.541643 4732 scope.go:117] "RemoveContainer" containerID="8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.541934 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4qqqg" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.565213 4732 scope.go:117] "RemoveContainer" containerID="bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.622329 4732 scope.go:117] "RemoveContainer" containerID="76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.627026 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4qqqg"] Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.641792 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4qqqg"] Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.665744 4732 scope.go:117] "RemoveContainer" containerID="8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd" Oct 10 09:05:24 crc kubenswrapper[4732]: E1010 09:05:24.666461 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd\": container with ID starting with 8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd not found: ID does not exist" containerID="8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 
09:05:24.666514 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd"} err="failed to get container status \"8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd\": rpc error: code = NotFound desc = could not find container \"8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd\": container with ID starting with 8070a86725b9b9ce41aca3754113e38fe480c60094ff2aed3d68309e69bba2dd not found: ID does not exist" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.666548 4732 scope.go:117] "RemoveContainer" containerID="bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f" Oct 10 09:05:24 crc kubenswrapper[4732]: E1010 09:05:24.667028 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f\": container with ID starting with bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f not found: ID does not exist" containerID="bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.667087 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f"} err="failed to get container status \"bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f\": rpc error: code = NotFound desc = could not find container \"bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f\": container with ID starting with bcd3727c458956ce8c5bf9f26d77effcf0c17a27cdb4b23ea887aab6ceac871f not found: ID does not exist" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.667122 4732 scope.go:117] "RemoveContainer" containerID="76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb" Oct 10 09:05:24 crc 
kubenswrapper[4732]: E1010 09:05:24.667533 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb\": container with ID starting with 76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb not found: ID does not exist" containerID="76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb" Oct 10 09:05:24 crc kubenswrapper[4732]: I1010 09:05:24.667571 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb"} err="failed to get container status \"76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb\": rpc error: code = NotFound desc = could not find container \"76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb\": container with ID starting with 76f3ad88a83c6407408b3cac2e3128b8e63c5145e15e361d77e96545c16142cb not found: ID does not exist" Oct 10 09:05:25 crc kubenswrapper[4732]: I1010 09:05:25.673028 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473ba42e-6745-40ef-b613-5065262041f6" path="/var/lib/kubelet/pods/473ba42e-6745-40ef-b613-5065262041f6/volumes" Oct 10 09:05:26 crc kubenswrapper[4732]: I1010 09:05:26.599961 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:26 crc kubenswrapper[4732]: I1010 09:05:26.678480 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:27 crc kubenswrapper[4732]: I1010 09:05:27.900679 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pkbpv"] Oct 10 09:05:28 crc kubenswrapper[4732]: I1010 09:05:28.601799 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-pkbpv" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="registry-server" containerID="cri-o://ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87" gracePeriod=2 Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.194194 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.305508 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-catalog-content\") pod \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.305902 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbfjb\" (UniqueName: \"kubernetes.io/projected/8264df3b-c19f-45d2-84ac-9dd063dbacdd-kube-api-access-qbfjb\") pod \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.306017 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-utilities\") pod \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\" (UID: \"8264df3b-c19f-45d2-84ac-9dd063dbacdd\") " Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.307079 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-utilities" (OuterVolumeSpecName: "utilities") pod "8264df3b-c19f-45d2-84ac-9dd063dbacdd" (UID: "8264df3b-c19f-45d2-84ac-9dd063dbacdd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.313285 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8264df3b-c19f-45d2-84ac-9dd063dbacdd-kube-api-access-qbfjb" (OuterVolumeSpecName: "kube-api-access-qbfjb") pod "8264df3b-c19f-45d2-84ac-9dd063dbacdd" (UID: "8264df3b-c19f-45d2-84ac-9dd063dbacdd"). InnerVolumeSpecName "kube-api-access-qbfjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.387138 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8264df3b-c19f-45d2-84ac-9dd063dbacdd" (UID: "8264df3b-c19f-45d2-84ac-9dd063dbacdd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.408624 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbfjb\" (UniqueName: \"kubernetes.io/projected/8264df3b-c19f-45d2-84ac-9dd063dbacdd-kube-api-access-qbfjb\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.408658 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.408672 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8264df3b-c19f-45d2-84ac-9dd063dbacdd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.613934 4732 generic.go:334] "Generic (PLEG): container finished" podID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" 
containerID="ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87" exitCode=0 Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.614023 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkbpv" event={"ID":"8264df3b-c19f-45d2-84ac-9dd063dbacdd","Type":"ContainerDied","Data":"ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87"} Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.614067 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pkbpv" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.614167 4732 scope.go:117] "RemoveContainer" containerID="ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.614144 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pkbpv" event={"ID":"8264df3b-c19f-45d2-84ac-9dd063dbacdd","Type":"ContainerDied","Data":"1a28c9c0c90de432c33427298c0552e942586351c32adb6bfd0d952cc1980bb6"} Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.645043 4732 scope.go:117] "RemoveContainer" containerID="03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.655623 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pkbpv"] Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.682725 4732 scope.go:117] "RemoveContainer" containerID="b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.699466 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pkbpv"] Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.712317 4732 scope.go:117] "RemoveContainer" containerID="ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87" Oct 10 
09:05:29 crc kubenswrapper[4732]: E1010 09:05:29.713182 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87\": container with ID starting with ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87 not found: ID does not exist" containerID="ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.713510 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87"} err="failed to get container status \"ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87\": rpc error: code = NotFound desc = could not find container \"ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87\": container with ID starting with ed6c33e30ee9165394fb7cfa5e40b943b7a30d0fe1a09aad661c535633deff87 not found: ID does not exist" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.714393 4732 scope.go:117] "RemoveContainer" containerID="03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28" Oct 10 09:05:29 crc kubenswrapper[4732]: E1010 09:05:29.715260 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28\": container with ID starting with 03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28 not found: ID does not exist" containerID="03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.715373 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28"} err="failed to get container status 
\"03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28\": rpc error: code = NotFound desc = could not find container \"03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28\": container with ID starting with 03cd09bbf1c6aebbaedfea0077bfefa62efb6413b52fd5b6fd96f1c777a60c28 not found: ID does not exist" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.715469 4732 scope.go:117] "RemoveContainer" containerID="b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4" Oct 10 09:05:29 crc kubenswrapper[4732]: E1010 09:05:29.716050 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4\": container with ID starting with b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4 not found: ID does not exist" containerID="b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4" Oct 10 09:05:29 crc kubenswrapper[4732]: I1010 09:05:29.716099 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4"} err="failed to get container status \"b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4\": rpc error: code = NotFound desc = could not find container \"b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4\": container with ID starting with b7cbd62e97eb7b7bcc0990d0d94a9ee52d2a1003ec9ee52bd096da4a0b41ead4 not found: ID does not exist" Oct 10 09:05:31 crc kubenswrapper[4732]: I1010 09:05:31.683653 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" path="/var/lib/kubelet/pods/8264df3b-c19f-45d2-84ac-9dd063dbacdd/volumes" Oct 10 09:05:49 crc kubenswrapper[4732]: I1010 09:05:49.856623 4732 generic.go:334] "Generic (PLEG): container finished" podID="dfd6910b-38c8-4a2c-9095-16c82b02e3e0" 
containerID="1b32fb6eae6e75e2a932724f1ab4240b3995da67cc64613661f4a5497cbf2907" exitCode=0 Oct 10 09:05:49 crc kubenswrapper[4732]: I1010 09:05:49.856822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" event={"ID":"dfd6910b-38c8-4a2c-9095-16c82b02e3e0","Type":"ContainerDied","Data":"1b32fb6eae6e75e2a932724f1ab4240b3995da67cc64613661f4a5497cbf2907"} Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.375911 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510371 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-combined-ca-bundle\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510470 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg4dq\" (UniqueName: \"kubernetes.io/projected/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-kube-api-access-fg4dq\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510547 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-0\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510663 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cells-global-config-0\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510729 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-1\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510769 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-ssh-key\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510908 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-1\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510963 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-inventory\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.510990 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-0\") pod \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\" (UID: \"dfd6910b-38c8-4a2c-9095-16c82b02e3e0\") " Oct 10 
09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.518981 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.519057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-kube-api-access-fg4dq" (OuterVolumeSpecName: "kube-api-access-fg4dq") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "kube-api-access-fg4dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.543676 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.546723 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.552033 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.555546 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.558411 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.562863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-inventory" (OuterVolumeSpecName: "inventory") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.565459 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "dfd6910b-38c8-4a2c-9095-16c82b02e3e0" (UID: "dfd6910b-38c8-4a2c-9095-16c82b02e3e0"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613056 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg4dq\" (UniqueName: \"kubernetes.io/projected/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-kube-api-access-fg4dq\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613086 4732 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613096 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613105 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613115 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613124 4732 
reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613133 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613142 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.613150 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfd6910b-38c8-4a2c-9095-16c82b02e3e0-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.876506 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" event={"ID":"dfd6910b-38c8-4a2c-9095-16c82b02e3e0","Type":"ContainerDied","Data":"c198447485038b1f164647e6c9ad1a5ad0d0861cdd0887b5996a2f6cd9ad4d8f"} Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.876808 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c198447485038b1f164647e6c9ad1a5ad0d0861cdd0887b5996a2f6cd9ad4d8f" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.876862 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7j7kl" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.979345 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-b7wxt"] Oct 10 09:05:51 crc kubenswrapper[4732]: E1010 09:05:51.980036 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="extract-utilities" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980067 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="extract-utilities" Oct 10 09:05:51 crc kubenswrapper[4732]: E1010 09:05:51.980109 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="registry-server" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980121 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="registry-server" Oct 10 09:05:51 crc kubenswrapper[4732]: E1010 09:05:51.980150 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd6910b-38c8-4a2c-9095-16c82b02e3e0" containerName="nova-cell1-openstack-openstack-cell1" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980160 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd6910b-38c8-4a2c-9095-16c82b02e3e0" containerName="nova-cell1-openstack-openstack-cell1" Oct 10 09:05:51 crc kubenswrapper[4732]: E1010 09:05:51.980171 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="registry-server" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980181 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="registry-server" Oct 10 09:05:51 crc kubenswrapper[4732]: E1010 09:05:51.980208 4732 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="extract-content" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980218 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="extract-content" Oct 10 09:05:51 crc kubenswrapper[4732]: E1010 09:05:51.980232 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="extract-content" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980240 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="extract-content" Oct 10 09:05:51 crc kubenswrapper[4732]: E1010 09:05:51.980254 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="extract-utilities" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980262 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="extract-utilities" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980539 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8264df3b-c19f-45d2-84ac-9dd063dbacdd" containerName="registry-server" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980568 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd6910b-38c8-4a2c-9095-16c82b02e3e0" containerName="nova-cell1-openstack-openstack-cell1" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.980595 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="473ba42e-6745-40ef-b613-5065262041f6" containerName="registry-server" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.981812 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.984859 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.984923 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.985319 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.985453 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 09:05:51 crc kubenswrapper[4732]: I1010 09:05:51.985482 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.002520 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-b7wxt"] Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.025071 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75w2\" (UniqueName: \"kubernetes.io/projected/ac31623c-6f14-4647-b50e-22c1a6e37741-kube-api-access-l75w2\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.025402 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " 
pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.025596 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ssh-key\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.025722 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.025844 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-inventory\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.025959 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.026075 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.127528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l75w2\" (UniqueName: \"kubernetes.io/projected/ac31623c-6f14-4647-b50e-22c1a6e37741-kube-api-access-l75w2\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.127579 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.127639 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ssh-key\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.127673 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " 
pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.127728 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-inventory\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.127774 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.127819 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.133338 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.133741 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.134123 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.134315 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ssh-key\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.135305 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-inventory\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.135480 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.145528 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75w2\" (UniqueName: \"kubernetes.io/projected/ac31623c-6f14-4647-b50e-22c1a6e37741-kube-api-access-l75w2\") pod \"telemetry-openstack-openstack-cell1-b7wxt\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") " pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.299293 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.850363 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-b7wxt"] Oct 10 09:05:52 crc kubenswrapper[4732]: I1010 09:05:52.908189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" event={"ID":"ac31623c-6f14-4647-b50e-22c1a6e37741","Type":"ContainerStarted","Data":"48a7aca44a6d70cbcd63014e96a9042e8d764cc3fdb2cbf5199a1d3ac21c94d2"} Oct 10 09:05:53 crc kubenswrapper[4732]: I1010 09:05:53.923905 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" event={"ID":"ac31623c-6f14-4647-b50e-22c1a6e37741","Type":"ContainerStarted","Data":"61d7564d769f6ee518fa55477e86e83ce9d8ae307001e1310749bfa3c7f938dd"} Oct 10 09:05:53 crc kubenswrapper[4732]: I1010 09:05:53.954072 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" podStartSLOduration=2.379211323 podStartE2EDuration="2.954052425s" podCreationTimestamp="2025-10-10 09:05:51 +0000 UTC" firstStartedPulling="2025-10-10 09:05:52.85539982 +0000 UTC m=+8079.924991061" lastFinishedPulling="2025-10-10 09:05:53.430240922 +0000 UTC m=+8080.499832163" observedRunningTime="2025-10-10 09:05:53.945607374 +0000 UTC m=+8081.015198645" watchObservedRunningTime="2025-10-10 09:05:53.954052425 +0000 
UTC m=+8081.023643676" Oct 10 09:06:55 crc kubenswrapper[4732]: I1010 09:06:55.356302 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:06:55 crc kubenswrapper[4732]: I1010 09:06:55.356992 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:07:25 crc kubenswrapper[4732]: I1010 09:07:25.356282 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:07:25 crc kubenswrapper[4732]: I1010 09:07:25.357089 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:07:55 crc kubenswrapper[4732]: I1010 09:07:55.355611 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:07:55 crc kubenswrapper[4732]: I1010 09:07:55.356307 4732 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:07:55 crc kubenswrapper[4732]: I1010 09:07:55.356361 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 09:07:55 crc kubenswrapper[4732]: I1010 09:07:55.357170 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"449ced8eaafb77645ddf4550bf80964ce8c9de6970eb9be8dfb5c0ba7f5d8379"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:07:55 crc kubenswrapper[4732]: I1010 09:07:55.357223 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://449ced8eaafb77645ddf4550bf80964ce8c9de6970eb9be8dfb5c0ba7f5d8379" gracePeriod=600 Oct 10 09:07:56 crc kubenswrapper[4732]: I1010 09:07:56.328094 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="449ced8eaafb77645ddf4550bf80964ce8c9de6970eb9be8dfb5c0ba7f5d8379" exitCode=0 Oct 10 09:07:56 crc kubenswrapper[4732]: I1010 09:07:56.328168 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"449ced8eaafb77645ddf4550bf80964ce8c9de6970eb9be8dfb5c0ba7f5d8379"} Oct 10 09:07:56 crc kubenswrapper[4732]: I1010 09:07:56.328676 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"} Oct 10 09:07:56 crc kubenswrapper[4732]: I1010 09:07:56.328712 4732 scope.go:117] "RemoveContainer" containerID="52ba96aebefc82aca9573bffbe9885a4e40799b94d8686f553d993623c94d334" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.247617 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j9n55"] Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.250757 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.276077 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9n55"] Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.287546 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-596nb\" (UniqueName: \"kubernetes.io/projected/e6826277-4031-4051-bc9f-79cfc58f1d4f-kube-api-access-596nb\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.287741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-catalog-content\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.287775 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-utilities\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.388651 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-catalog-content\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.388727 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-utilities\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.388824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-596nb\" (UniqueName: \"kubernetes.io/projected/e6826277-4031-4051-bc9f-79cfc58f1d4f-kube-api-access-596nb\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.389336 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-catalog-content\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.389456 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-utilities\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.408915 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-596nb\" (UniqueName: \"kubernetes.io/projected/e6826277-4031-4051-bc9f-79cfc58f1d4f-kube-api-access-596nb\") pod \"redhat-operators-j9n55\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:20 crc kubenswrapper[4732]: I1010 09:09:20.581450 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:21 crc kubenswrapper[4732]: I1010 09:09:21.056332 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j9n55"] Oct 10 09:09:21 crc kubenswrapper[4732]: I1010 09:09:21.243365 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9n55" event={"ID":"e6826277-4031-4051-bc9f-79cfc58f1d4f","Type":"ContainerStarted","Data":"6c7504a61a2895572a0150220a801a86d8b1cb48cf08c265afd8e005f3348610"} Oct 10 09:09:22 crc kubenswrapper[4732]: I1010 09:09:22.255166 4732 generic.go:334] "Generic (PLEG): container finished" podID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerID="0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438" exitCode=0 Oct 10 09:09:22 crc kubenswrapper[4732]: I1010 09:09:22.255215 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9n55" event={"ID":"e6826277-4031-4051-bc9f-79cfc58f1d4f","Type":"ContainerDied","Data":"0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438"} Oct 10 09:09:22 crc kubenswrapper[4732]: I1010 09:09:22.257314 4732 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Oct 10 09:09:23 crc kubenswrapper[4732]: I1010 09:09:23.265519 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9n55" event={"ID":"e6826277-4031-4051-bc9f-79cfc58f1d4f","Type":"ContainerStarted","Data":"5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41"} Oct 10 09:09:27 crc kubenswrapper[4732]: I1010 09:09:27.318604 4732 generic.go:334] "Generic (PLEG): container finished" podID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerID="5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41" exitCode=0 Oct 10 09:09:27 crc kubenswrapper[4732]: I1010 09:09:27.318684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9n55" event={"ID":"e6826277-4031-4051-bc9f-79cfc58f1d4f","Type":"ContainerDied","Data":"5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41"} Oct 10 09:09:28 crc kubenswrapper[4732]: I1010 09:09:28.331194 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9n55" event={"ID":"e6826277-4031-4051-bc9f-79cfc58f1d4f","Type":"ContainerStarted","Data":"7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c"} Oct 10 09:09:28 crc kubenswrapper[4732]: I1010 09:09:28.358438 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j9n55" podStartSLOduration=2.820471939 podStartE2EDuration="8.358417429s" podCreationTimestamp="2025-10-10 09:09:20 +0000 UTC" firstStartedPulling="2025-10-10 09:09:22.256862202 +0000 UTC m=+8289.326453443" lastFinishedPulling="2025-10-10 09:09:27.794807682 +0000 UTC m=+8294.864398933" observedRunningTime="2025-10-10 09:09:28.352737414 +0000 UTC m=+8295.422328665" watchObservedRunningTime="2025-10-10 09:09:28.358417429 +0000 UTC m=+8295.428008670" Oct 10 09:09:30 crc kubenswrapper[4732]: I1010 09:09:30.581993 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:30 crc kubenswrapper[4732]: I1010 09:09:30.582031 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:31 crc kubenswrapper[4732]: I1010 09:09:31.631067 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j9n55" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="registry-server" probeResult="failure" output=< Oct 10 09:09:31 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 09:09:31 crc kubenswrapper[4732]: > Oct 10 09:09:40 crc kubenswrapper[4732]: I1010 09:09:40.674155 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:40 crc kubenswrapper[4732]: I1010 09:09:40.761829 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:40 crc kubenswrapper[4732]: I1010 09:09:40.938237 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j9n55"] Oct 10 09:09:42 crc kubenswrapper[4732]: I1010 09:09:42.491105 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j9n55" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="registry-server" containerID="cri-o://7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c" gracePeriod=2 Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.128038 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.294265 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-catalog-content\") pod \"e6826277-4031-4051-bc9f-79cfc58f1d4f\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.294356 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-utilities\") pod \"e6826277-4031-4051-bc9f-79cfc58f1d4f\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.294587 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-596nb\" (UniqueName: \"kubernetes.io/projected/e6826277-4031-4051-bc9f-79cfc58f1d4f-kube-api-access-596nb\") pod \"e6826277-4031-4051-bc9f-79cfc58f1d4f\" (UID: \"e6826277-4031-4051-bc9f-79cfc58f1d4f\") " Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.295104 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-utilities" (OuterVolumeSpecName: "utilities") pod "e6826277-4031-4051-bc9f-79cfc58f1d4f" (UID: "e6826277-4031-4051-bc9f-79cfc58f1d4f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.295584 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.300553 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6826277-4031-4051-bc9f-79cfc58f1d4f-kube-api-access-596nb" (OuterVolumeSpecName: "kube-api-access-596nb") pod "e6826277-4031-4051-bc9f-79cfc58f1d4f" (UID: "e6826277-4031-4051-bc9f-79cfc58f1d4f"). InnerVolumeSpecName "kube-api-access-596nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.398478 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-596nb\" (UniqueName: \"kubernetes.io/projected/e6826277-4031-4051-bc9f-79cfc58f1d4f-kube-api-access-596nb\") on node \"crc\" DevicePath \"\"" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.400418 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6826277-4031-4051-bc9f-79cfc58f1d4f" (UID: "e6826277-4031-4051-bc9f-79cfc58f1d4f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.501369 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6826277-4031-4051-bc9f-79cfc58f1d4f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.505386 4732 generic.go:334] "Generic (PLEG): container finished" podID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerID="7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c" exitCode=0 Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.505427 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9n55" event={"ID":"e6826277-4031-4051-bc9f-79cfc58f1d4f","Type":"ContainerDied","Data":"7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c"} Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.505455 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j9n55" event={"ID":"e6826277-4031-4051-bc9f-79cfc58f1d4f","Type":"ContainerDied","Data":"6c7504a61a2895572a0150220a801a86d8b1cb48cf08c265afd8e005f3348610"} Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.505463 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j9n55" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.505474 4732 scope.go:117] "RemoveContainer" containerID="7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.549065 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j9n55"] Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.554018 4732 scope.go:117] "RemoveContainer" containerID="5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.560525 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j9n55"] Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.590465 4732 scope.go:117] "RemoveContainer" containerID="0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.654298 4732 scope.go:117] "RemoveContainer" containerID="7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c" Oct 10 09:09:43 crc kubenswrapper[4732]: E1010 09:09:43.655240 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c\": container with ID starting with 7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c not found: ID does not exist" containerID="7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c" Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.655294 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c"} err="failed to get container status \"7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c\": rpc error: code = NotFound desc = could not find container 
\"7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c\": container with ID starting with 7e0d2e08a65e0d293da1b79de620969864d4cdb96e8b92afb1496f351b29538c not found: ID does not exist"
Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.655330 4732 scope.go:117] "RemoveContainer" containerID="5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41"
Oct 10 09:09:43 crc kubenswrapper[4732]: E1010 09:09:43.656138 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41\": container with ID starting with 5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41 not found: ID does not exist" containerID="5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41"
Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.656171 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41"} err="failed to get container status \"5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41\": rpc error: code = NotFound desc = could not find container \"5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41\": container with ID starting with 5b04998791f3177ea12d5a54bce7adda00648082e5a5771bed00e1405a0fac41 not found: ID does not exist"
Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.656191 4732 scope.go:117] "RemoveContainer" containerID="0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438"
Oct 10 09:09:43 crc kubenswrapper[4732]: E1010 09:09:43.656638 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438\": container with ID starting with 0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438 not found: ID does not exist" containerID="0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438"
Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.656674 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438"} err="failed to get container status \"0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438\": rpc error: code = NotFound desc = could not find container \"0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438\": container with ID starting with 0fa4f395bf971825b56e10917c5143d991d3eeefe7ca799a5646c635b3c00438 not found: ID does not exist"
Oct 10 09:09:43 crc kubenswrapper[4732]: I1010 09:09:43.675560 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" path="/var/lib/kubelet/pods/e6826277-4031-4051-bc9f-79cfc58f1d4f/volumes"
Oct 10 09:09:55 crc kubenswrapper[4732]: I1010 09:09:55.356562 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 09:09:55 crc kubenswrapper[4732]: I1010 09:09:55.357561 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 09:10:12 crc kubenswrapper[4732]: I1010 09:10:12.849318 4732 generic.go:334] "Generic (PLEG): container finished" podID="ac31623c-6f14-4647-b50e-22c1a6e37741" containerID="61d7564d769f6ee518fa55477e86e83ce9d8ae307001e1310749bfa3c7f938dd" exitCode=0
Oct 10 09:10:12 crc kubenswrapper[4732]: I1010 09:10:12.849445 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" event={"ID":"ac31623c-6f14-4647-b50e-22c1a6e37741","Type":"ContainerDied","Data":"61d7564d769f6ee518fa55477e86e83ce9d8ae307001e1310749bfa3c7f938dd"}
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.361564 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt"
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.427846 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-0\") pod \"ac31623c-6f14-4647-b50e-22c1a6e37741\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") "
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.428574 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-telemetry-combined-ca-bundle\") pod \"ac31623c-6f14-4647-b50e-22c1a6e37741\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") "
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.428608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ssh-key\") pod \"ac31623c-6f14-4647-b50e-22c1a6e37741\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") "
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.428678 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-inventory\") pod \"ac31623c-6f14-4647-b50e-22c1a6e37741\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") "
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.428738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-1\") pod \"ac31623c-6f14-4647-b50e-22c1a6e37741\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") "
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.428763 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-2\") pod \"ac31623c-6f14-4647-b50e-22c1a6e37741\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") "
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.428815 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l75w2\" (UniqueName: \"kubernetes.io/projected/ac31623c-6f14-4647-b50e-22c1a6e37741-kube-api-access-l75w2\") pod \"ac31623c-6f14-4647-b50e-22c1a6e37741\" (UID: \"ac31623c-6f14-4647-b50e-22c1a6e37741\") "
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.433310 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ac31623c-6f14-4647-b50e-22c1a6e37741" (UID: "ac31623c-6f14-4647-b50e-22c1a6e37741"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.435115 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac31623c-6f14-4647-b50e-22c1a6e37741-kube-api-access-l75w2" (OuterVolumeSpecName: "kube-api-access-l75w2") pod "ac31623c-6f14-4647-b50e-22c1a6e37741" (UID: "ac31623c-6f14-4647-b50e-22c1a6e37741"). InnerVolumeSpecName "kube-api-access-l75w2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.463471 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-inventory" (OuterVolumeSpecName: "inventory") pod "ac31623c-6f14-4647-b50e-22c1a6e37741" (UID: "ac31623c-6f14-4647-b50e-22c1a6e37741"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.465235 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ac31623c-6f14-4647-b50e-22c1a6e37741" (UID: "ac31623c-6f14-4647-b50e-22c1a6e37741"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.478759 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ac31623c-6f14-4647-b50e-22c1a6e37741" (UID: "ac31623c-6f14-4647-b50e-22c1a6e37741"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.482573 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ac31623c-6f14-4647-b50e-22c1a6e37741" (UID: "ac31623c-6f14-4647-b50e-22c1a6e37741"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.486475 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ac31623c-6f14-4647-b50e-22c1a6e37741" (UID: "ac31623c-6f14-4647-b50e-22c1a6e37741"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.530672 4732 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.530725 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.530734 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-inventory\") on node \"crc\" DevicePath \"\""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.530742 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.530752 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.530762 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l75w2\" (UniqueName: \"kubernetes.io/projected/ac31623c-6f14-4647-b50e-22c1a6e37741-kube-api-access-l75w2\") on node \"crc\" DevicePath \"\""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.530771 4732 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ac31623c-6f14-4647-b50e-22c1a6e37741-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.873201 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt" event={"ID":"ac31623c-6f14-4647-b50e-22c1a6e37741","Type":"ContainerDied","Data":"48a7aca44a6d70cbcd63014e96a9042e8d764cc3fdb2cbf5199a1d3ac21c94d2"}
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.873266 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48a7aca44a6d70cbcd63014e96a9042e8d764cc3fdb2cbf5199a1d3ac21c94d2"
Oct 10 09:10:14 crc kubenswrapper[4732]: I1010 09:10:14.873330 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-b7wxt"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.005208 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"]
Oct 10 09:10:15 crc kubenswrapper[4732]: E1010 09:10:15.005640 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac31623c-6f14-4647-b50e-22c1a6e37741" containerName="telemetry-openstack-openstack-cell1"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.005657 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac31623c-6f14-4647-b50e-22c1a6e37741" containerName="telemetry-openstack-openstack-cell1"
Oct 10 09:10:15 crc kubenswrapper[4732]: E1010 09:10:15.005669 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="registry-server"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.005677 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="registry-server"
Oct 10 09:10:15 crc kubenswrapper[4732]: E1010 09:10:15.005718 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="extract-utilities"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.005726 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="extract-utilities"
Oct 10 09:10:15 crc kubenswrapper[4732]: E1010 09:10:15.005741 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="extract-content"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.005747 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="extract-content"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.005944 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac31623c-6f14-4647-b50e-22c1a6e37741" containerName="telemetry-openstack-openstack-cell1"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.005959 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6826277-4031-4051-bc9f-79cfc58f1d4f" containerName="registry-server"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.006648 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.011262 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.011789 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.012110 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.012298 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.012561 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.024622 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"]
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.150880 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.150999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbrq\" (UniqueName: \"kubernetes.io/projected/9a0f923f-84b2-44b1-9030-636b08eff952-kube-api-access-qvbrq\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.151027 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.151111 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.151311 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.252763 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.252843 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbrq\" (UniqueName: \"kubernetes.io/projected/9a0f923f-84b2-44b1-9030-636b08eff952-kube-api-access-qvbrq\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.252873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.252997 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.253072 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.256616 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.258322 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.262254 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.267066 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.269607 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbrq\" (UniqueName: \"kubernetes.io/projected/9a0f923f-84b2-44b1-9030-636b08eff952-kube-api-access-qvbrq\") pod \"neutron-sriov-openstack-openstack-cell1-g7zhx\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.323305 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.873260 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-g7zhx"]
Oct 10 09:10:15 crc kubenswrapper[4732]: I1010 09:10:15.884955 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx" event={"ID":"9a0f923f-84b2-44b1-9030-636b08eff952","Type":"ContainerStarted","Data":"ae88722c84bed413d43dd22b00e5033eaaf3f37346a635835c7f7598089927ed"}
Oct 10 09:10:16 crc kubenswrapper[4732]: I1010 09:10:16.897739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx" event={"ID":"9a0f923f-84b2-44b1-9030-636b08eff952","Type":"ContainerStarted","Data":"007882563f064a2f0f9a64886124c1ef2491386038ed27e12c81b9d4105edeae"}
Oct 10 09:10:16 crc kubenswrapper[4732]: I1010 09:10:16.923111 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx" podStartSLOduration=2.263717746 podStartE2EDuration="2.923089402s" podCreationTimestamp="2025-10-10 09:10:14 +0000 UTC" firstStartedPulling="2025-10-10 09:10:15.874222763 +0000 UTC m=+8342.943814044" lastFinishedPulling="2025-10-10 09:10:16.533594419 +0000 UTC m=+8343.603185700" observedRunningTime="2025-10-10 09:10:16.920792709 +0000 UTC m=+8343.990383950" watchObservedRunningTime="2025-10-10 09:10:16.923089402 +0000 UTC m=+8343.992680653"
Oct 10 09:10:25 crc kubenswrapper[4732]: I1010 09:10:25.356721 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 09:10:25 crc kubenswrapper[4732]: I1010 09:10:25.357063 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 09:10:55 crc kubenswrapper[4732]: I1010 09:10:55.356048 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 09:10:55 crc kubenswrapper[4732]: I1010 09:10:55.356793 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 10 09:10:55 crc kubenswrapper[4732]: I1010 09:10:55.356862 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd"
Oct 10 09:10:55 crc kubenswrapper[4732]: I1010 09:10:55.357912 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 10 09:10:55 crc kubenswrapper[4732]: I1010 09:10:55.358002 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" gracePeriod=600
Oct 10 09:10:55 crc kubenswrapper[4732]: E1010 09:10:55.484539 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:10:56 crc kubenswrapper[4732]: I1010 09:10:56.311949 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" exitCode=0
Oct 10 09:10:56 crc kubenswrapper[4732]: I1010 09:10:56.312009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"}
Oct 10 09:10:56 crc kubenswrapper[4732]: I1010 09:10:56.312269 4732 scope.go:117] "RemoveContainer" containerID="449ced8eaafb77645ddf4550bf80964ce8c9de6970eb9be8dfb5c0ba7f5d8379"
Oct 10 09:10:56 crc kubenswrapper[4732]: I1010 09:10:56.312850 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:10:56 crc kubenswrapper[4732]: E1010 09:10:56.313105 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:11:11 crc kubenswrapper[4732]: I1010 09:11:11.660593 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:11:11 crc kubenswrapper[4732]: E1010 09:11:11.661521 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:11:24 crc kubenswrapper[4732]: I1010 09:11:24.661753 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:11:24 crc kubenswrapper[4732]: E1010 09:11:24.662678 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:11:39 crc kubenswrapper[4732]: I1010 09:11:39.660844 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:11:39 crc kubenswrapper[4732]: E1010 09:11:39.663044 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:11:52 crc kubenswrapper[4732]: I1010 09:11:52.660507 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:11:52 crc kubenswrapper[4732]: E1010 09:11:52.661897 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:12:04 crc kubenswrapper[4732]: I1010 09:12:04.661975 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:12:04 crc kubenswrapper[4732]: E1010 09:12:04.663305 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:12:19 crc kubenswrapper[4732]: I1010 09:12:19.660849 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:12:19 crc kubenswrapper[4732]: E1010 09:12:19.662194 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:12:33 crc kubenswrapper[4732]: I1010 09:12:33.673181 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675"
Oct 10 09:12:33 crc kubenswrapper[4732]: E1010 09:12:33.674993 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0"
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.497305 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67t6r"]
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.501798 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67t6r"
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.523391 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67t6r"]
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.673265 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-catalog-content\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r"
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.673523 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-utilities\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r"
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.673581 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4dmw\" (UniqueName: \"kubernetes.io/projected/a963465b-6dc6-4944-8626-00bc7aea5796-kube-api-access-m4dmw\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r"
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.776586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-utilities\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r"
Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.777185 4732 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"kube-api-access-m4dmw\" (UniqueName: \"kubernetes.io/projected/a963465b-6dc6-4944-8626-00bc7aea5796-kube-api-access-m4dmw\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.777109 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-utilities\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.777427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-catalog-content\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.778162 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-catalog-content\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.798400 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4dmw\" (UniqueName: \"kubernetes.io/projected/a963465b-6dc6-4944-8626-00bc7aea5796-kube-api-access-m4dmw\") pod \"redhat-marketplace-67t6r\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:46 crc kubenswrapper[4732]: I1010 09:12:46.850930 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:47 crc kubenswrapper[4732]: I1010 09:12:47.343863 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67t6r"] Oct 10 09:12:47 crc kubenswrapper[4732]: I1010 09:12:47.524650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67t6r" event={"ID":"a963465b-6dc6-4944-8626-00bc7aea5796","Type":"ContainerStarted","Data":"5d4c1388bd29d6a5ec039405ea648430b53403c7fdb9acc03e6a2be049ef79bf"} Oct 10 09:12:48 crc kubenswrapper[4732]: I1010 09:12:48.537387 4732 generic.go:334] "Generic (PLEG): container finished" podID="a963465b-6dc6-4944-8626-00bc7aea5796" containerID="dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682" exitCode=0 Oct 10 09:12:48 crc kubenswrapper[4732]: I1010 09:12:48.537456 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67t6r" event={"ID":"a963465b-6dc6-4944-8626-00bc7aea5796","Type":"ContainerDied","Data":"dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682"} Oct 10 09:12:48 crc kubenswrapper[4732]: I1010 09:12:48.660938 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:12:48 crc kubenswrapper[4732]: E1010 09:12:48.661391 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:12:49 crc kubenswrapper[4732]: I1010 09:12:49.551360 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67t6r" 
event={"ID":"a963465b-6dc6-4944-8626-00bc7aea5796","Type":"ContainerStarted","Data":"4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2"} Oct 10 09:12:50 crc kubenswrapper[4732]: I1010 09:12:50.566251 4732 generic.go:334] "Generic (PLEG): container finished" podID="a963465b-6dc6-4944-8626-00bc7aea5796" containerID="4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2" exitCode=0 Oct 10 09:12:50 crc kubenswrapper[4732]: I1010 09:12:50.566310 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67t6r" event={"ID":"a963465b-6dc6-4944-8626-00bc7aea5796","Type":"ContainerDied","Data":"4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2"} Oct 10 09:12:51 crc kubenswrapper[4732]: I1010 09:12:51.580252 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67t6r" event={"ID":"a963465b-6dc6-4944-8626-00bc7aea5796","Type":"ContainerStarted","Data":"ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a"} Oct 10 09:12:51 crc kubenswrapper[4732]: I1010 09:12:51.602237 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67t6r" podStartSLOduration=3.079913306 podStartE2EDuration="5.602213871s" podCreationTimestamp="2025-10-10 09:12:46 +0000 UTC" firstStartedPulling="2025-10-10 09:12:48.540794988 +0000 UTC m=+8495.610386239" lastFinishedPulling="2025-10-10 09:12:51.063095523 +0000 UTC m=+8498.132686804" observedRunningTime="2025-10-10 09:12:51.599163148 +0000 UTC m=+8498.668754389" watchObservedRunningTime="2025-10-10 09:12:51.602213871 +0000 UTC m=+8498.671805112" Oct 10 09:12:56 crc kubenswrapper[4732]: I1010 09:12:56.851540 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:56 crc kubenswrapper[4732]: I1010 09:12:56.852753 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:56 crc kubenswrapper[4732]: I1010 09:12:56.947343 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:57 crc kubenswrapper[4732]: I1010 09:12:57.755227 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:12:57 crc kubenswrapper[4732]: I1010 09:12:57.828250 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67t6r"] Oct 10 09:12:59 crc kubenswrapper[4732]: I1010 09:12:59.683122 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67t6r" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="registry-server" containerID="cri-o://ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a" gracePeriod=2 Oct 10 09:12:59 crc kubenswrapper[4732]: E1010 09:12:59.845005 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda963465b_6dc6_4944_8626_00bc7aea5796.slice/crio-conmon-ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a.scope\": RecentStats: unable to find data in memory cache]" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.168613 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.190383 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4dmw\" (UniqueName: \"kubernetes.io/projected/a963465b-6dc6-4944-8626-00bc7aea5796-kube-api-access-m4dmw\") pod \"a963465b-6dc6-4944-8626-00bc7aea5796\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.190593 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-utilities\") pod \"a963465b-6dc6-4944-8626-00bc7aea5796\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.190646 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-catalog-content\") pod \"a963465b-6dc6-4944-8626-00bc7aea5796\" (UID: \"a963465b-6dc6-4944-8626-00bc7aea5796\") " Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.191937 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-utilities" (OuterVolumeSpecName: "utilities") pod "a963465b-6dc6-4944-8626-00bc7aea5796" (UID: "a963465b-6dc6-4944-8626-00bc7aea5796"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.199078 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a963465b-6dc6-4944-8626-00bc7aea5796-kube-api-access-m4dmw" (OuterVolumeSpecName: "kube-api-access-m4dmw") pod "a963465b-6dc6-4944-8626-00bc7aea5796" (UID: "a963465b-6dc6-4944-8626-00bc7aea5796"). InnerVolumeSpecName "kube-api-access-m4dmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.207004 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a963465b-6dc6-4944-8626-00bc7aea5796" (UID: "a963465b-6dc6-4944-8626-00bc7aea5796"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.292794 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.292847 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a963465b-6dc6-4944-8626-00bc7aea5796-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.292861 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4dmw\" (UniqueName: \"kubernetes.io/projected/a963465b-6dc6-4944-8626-00bc7aea5796-kube-api-access-m4dmw\") on node \"crc\" DevicePath \"\"" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.699576 4732 generic.go:334] "Generic (PLEG): container finished" podID="a963465b-6dc6-4944-8626-00bc7aea5796" containerID="ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a" exitCode=0 Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.699639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67t6r" event={"ID":"a963465b-6dc6-4944-8626-00bc7aea5796","Type":"ContainerDied","Data":"ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a"} Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.700049 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-67t6r" event={"ID":"a963465b-6dc6-4944-8626-00bc7aea5796","Type":"ContainerDied","Data":"5d4c1388bd29d6a5ec039405ea648430b53403c7fdb9acc03e6a2be049ef79bf"} Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.699664 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67t6r" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.700070 4732 scope.go:117] "RemoveContainer" containerID="ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.738964 4732 scope.go:117] "RemoveContainer" containerID="4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.761315 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67t6r"] Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.769278 4732 scope.go:117] "RemoveContainer" containerID="dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.772530 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67t6r"] Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.818314 4732 scope.go:117] "RemoveContainer" containerID="ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a" Oct 10 09:13:00 crc kubenswrapper[4732]: E1010 09:13:00.818846 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a\": container with ID starting with ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a not found: ID does not exist" containerID="ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.818890 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a"} err="failed to get container status \"ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a\": rpc error: code = NotFound desc = could not find container \"ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a\": container with ID starting with ade0db99ea1d672e059b95ea9dd3676a86d1b8730dbb20e9e533c6140ebea07a not found: ID does not exist" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.818915 4732 scope.go:117] "RemoveContainer" containerID="4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2" Oct 10 09:13:00 crc kubenswrapper[4732]: E1010 09:13:00.819506 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2\": container with ID starting with 4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2 not found: ID does not exist" containerID="4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.819548 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2"} err="failed to get container status \"4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2\": rpc error: code = NotFound desc = could not find container \"4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2\": container with ID starting with 4efaee4a3d32bf363960a86ca85553945ccf2a5900939f7e86a1d7a3ee27ffc2 not found: ID does not exist" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.819573 4732 scope.go:117] "RemoveContainer" containerID="dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682" Oct 10 09:13:00 crc kubenswrapper[4732]: E1010 
09:13:00.819902 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682\": container with ID starting with dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682 not found: ID does not exist" containerID="dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682" Oct 10 09:13:00 crc kubenswrapper[4732]: I1010 09:13:00.819929 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682"} err="failed to get container status \"dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682\": rpc error: code = NotFound desc = could not find container \"dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682\": container with ID starting with dd865ba9fd5ea08e946619211d48fac25efedba5856e4871f0c511991ba48682 not found: ID does not exist" Oct 10 09:13:01 crc kubenswrapper[4732]: I1010 09:13:01.675290 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" path="/var/lib/kubelet/pods/a963465b-6dc6-4944-8626-00bc7aea5796/volumes" Oct 10 09:13:03 crc kubenswrapper[4732]: I1010 09:13:03.672644 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:13:03 crc kubenswrapper[4732]: E1010 09:13:03.673578 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:13:15 crc kubenswrapper[4732]: I1010 09:13:15.661555 
4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:13:15 crc kubenswrapper[4732]: E1010 09:13:15.663058 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:13:30 crc kubenswrapper[4732]: I1010 09:13:30.660445 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:13:30 crc kubenswrapper[4732]: E1010 09:13:30.661610 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:13:44 crc kubenswrapper[4732]: I1010 09:13:44.660457 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:13:44 crc kubenswrapper[4732]: E1010 09:13:44.661150 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:13:58 crc kubenswrapper[4732]: I1010 
09:13:58.661325 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:13:58 crc kubenswrapper[4732]: E1010 09:13:58.663173 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:14:10 crc kubenswrapper[4732]: I1010 09:14:10.661375 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:14:10 crc kubenswrapper[4732]: E1010 09:14:10.662475 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:14:25 crc kubenswrapper[4732]: I1010 09:14:25.660556 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:14:25 crc kubenswrapper[4732]: E1010 09:14:25.662172 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:14:40 crc 
kubenswrapper[4732]: I1010 09:14:40.660856 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:14:40 crc kubenswrapper[4732]: E1010 09:14:40.662080 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:14:51 crc kubenswrapper[4732]: I1010 09:14:51.661546 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:14:51 crc kubenswrapper[4732]: E1010 09:14:51.662362 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.192256 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph"] Oct 10 09:15:00 crc kubenswrapper[4732]: E1010 09:15:00.193364 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="extract-content" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.193381 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="extract-content" Oct 10 09:15:00 crc kubenswrapper[4732]: E1010 09:15:00.193424 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="extract-utilities" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.193433 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="extract-utilities" Oct 10 09:15:00 crc kubenswrapper[4732]: E1010 09:15:00.193463 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="registry-server" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.193472 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="registry-server" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.193752 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a963465b-6dc6-4944-8626-00bc7aea5796" containerName="registry-server" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.194675 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.198056 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.198289 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.235834 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph"] Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.285111 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/963d2208-e617-4694-8b7f-a93fe3158a74-config-volume\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.285401 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwztv\" (UniqueName: \"kubernetes.io/projected/963d2208-e617-4694-8b7f-a93fe3158a74-kube-api-access-mwztv\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.285540 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/963d2208-e617-4694-8b7f-a93fe3158a74-secret-volume\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.387979 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/963d2208-e617-4694-8b7f-a93fe3158a74-config-volume\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.388068 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwztv\" (UniqueName: \"kubernetes.io/projected/963d2208-e617-4694-8b7f-a93fe3158a74-kube-api-access-mwztv\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.388123 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/963d2208-e617-4694-8b7f-a93fe3158a74-secret-volume\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.389968 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/963d2208-e617-4694-8b7f-a93fe3158a74-config-volume\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.397527 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/963d2208-e617-4694-8b7f-a93fe3158a74-secret-volume\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.416172 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwztv\" (UniqueName: \"kubernetes.io/projected/963d2208-e617-4694-8b7f-a93fe3158a74-kube-api-access-mwztv\") pod \"collect-profiles-29334795-dmtph\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:00 crc kubenswrapper[4732]: I1010 09:15:00.568532 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:01 crc kubenswrapper[4732]: I1010 09:15:01.060292 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph"] Oct 10 09:15:01 crc kubenswrapper[4732]: I1010 09:15:01.163604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" event={"ID":"963d2208-e617-4694-8b7f-a93fe3158a74","Type":"ContainerStarted","Data":"59db2d05738e1b6ddcced610e8a6f40d1d6aba154cc8a9b3ec69fde4e1e5f0d0"} Oct 10 09:15:02 crc kubenswrapper[4732]: I1010 09:15:02.175344 4732 generic.go:334] "Generic (PLEG): container finished" podID="963d2208-e617-4694-8b7f-a93fe3158a74" containerID="5cf5ba25c33c88d2b5ef0197b94c152872db5bb853af94dd365c72e663c6b44c" exitCode=0 Oct 10 09:15:02 crc kubenswrapper[4732]: I1010 09:15:02.175843 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" 
event={"ID":"963d2208-e617-4694-8b7f-a93fe3158a74","Type":"ContainerDied","Data":"5cf5ba25c33c88d2b5ef0197b94c152872db5bb853af94dd365c72e663c6b44c"} Oct 10 09:15:02 crc kubenswrapper[4732]: I1010 09:15:02.660890 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:15:02 crc kubenswrapper[4732]: E1010 09:15:02.661433 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.614731 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.671302 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwztv\" (UniqueName: \"kubernetes.io/projected/963d2208-e617-4694-8b7f-a93fe3158a74-kube-api-access-mwztv\") pod \"963d2208-e617-4694-8b7f-a93fe3158a74\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.671393 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/963d2208-e617-4694-8b7f-a93fe3158a74-secret-volume\") pod \"963d2208-e617-4694-8b7f-a93fe3158a74\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.696027 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963d2208-e617-4694-8b7f-a93fe3158a74-secret-volume" (OuterVolumeSpecName: 
"secret-volume") pod "963d2208-e617-4694-8b7f-a93fe3158a74" (UID: "963d2208-e617-4694-8b7f-a93fe3158a74"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.696490 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963d2208-e617-4694-8b7f-a93fe3158a74-kube-api-access-mwztv" (OuterVolumeSpecName: "kube-api-access-mwztv") pod "963d2208-e617-4694-8b7f-a93fe3158a74" (UID: "963d2208-e617-4694-8b7f-a93fe3158a74"). InnerVolumeSpecName "kube-api-access-mwztv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.773515 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/963d2208-e617-4694-8b7f-a93fe3158a74-config-volume\") pod \"963d2208-e617-4694-8b7f-a93fe3158a74\" (UID: \"963d2208-e617-4694-8b7f-a93fe3158a74\") " Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.774098 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwztv\" (UniqueName: \"kubernetes.io/projected/963d2208-e617-4694-8b7f-a93fe3158a74-kube-api-access-mwztv\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.774121 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/963d2208-e617-4694-8b7f-a93fe3158a74-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.775709 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/963d2208-e617-4694-8b7f-a93fe3158a74-config-volume" (OuterVolumeSpecName: "config-volume") pod "963d2208-e617-4694-8b7f-a93fe3158a74" (UID: "963d2208-e617-4694-8b7f-a93fe3158a74"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:15:03 crc kubenswrapper[4732]: I1010 09:15:03.875531 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/963d2208-e617-4694-8b7f-a93fe3158a74-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:04 crc kubenswrapper[4732]: I1010 09:15:04.199224 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" event={"ID":"963d2208-e617-4694-8b7f-a93fe3158a74","Type":"ContainerDied","Data":"59db2d05738e1b6ddcced610e8a6f40d1d6aba154cc8a9b3ec69fde4e1e5f0d0"} Oct 10 09:15:04 crc kubenswrapper[4732]: I1010 09:15:04.199271 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59db2d05738e1b6ddcced610e8a6f40d1d6aba154cc8a9b3ec69fde4e1e5f0d0" Oct 10 09:15:04 crc kubenswrapper[4732]: I1010 09:15:04.199293 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334795-dmtph" Oct 10 09:15:04 crc kubenswrapper[4732]: I1010 09:15:04.689417 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z"] Oct 10 09:15:04 crc kubenswrapper[4732]: I1010 09:15:04.702043 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334750-9hz9z"] Oct 10 09:15:05 crc kubenswrapper[4732]: I1010 09:15:05.674356 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77b1c333-44ac-4cb6-bac5-598107d56e7b" path="/var/lib/kubelet/pods/77b1c333-44ac-4cb6-bac5-598107d56e7b/volumes" Oct 10 09:15:13 crc kubenswrapper[4732]: I1010 09:15:13.668957 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:15:13 crc kubenswrapper[4732]: E1010 09:15:13.669853 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:15:26 crc kubenswrapper[4732]: I1010 09:15:26.660318 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:15:26 crc kubenswrapper[4732]: E1010 09:15:26.660965 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:15:38 crc kubenswrapper[4732]: I1010 09:15:38.629279 4732 scope.go:117] "RemoveContainer" containerID="d2e37ba17e770666d7f7da5a1690c35585862e459b2fb04341396494a59302ee" Oct 10 09:15:39 crc kubenswrapper[4732]: I1010 09:15:39.661278 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:15:39 crc kubenswrapper[4732]: E1010 09:15:39.663001 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.033894 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cb8l4"] Oct 10 09:15:44 crc kubenswrapper[4732]: E1010 09:15:44.034890 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963d2208-e617-4694-8b7f-a93fe3158a74" containerName="collect-profiles" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.034903 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="963d2208-e617-4694-8b7f-a93fe3158a74" containerName="collect-profiles" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.035107 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="963d2208-e617-4694-8b7f-a93fe3158a74" containerName="collect-profiles" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.036526 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.050184 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cb8l4"] Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.149993 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-utilities\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.150088 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-catalog-content\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.150173 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr5d5\" (UniqueName: \"kubernetes.io/projected/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-kube-api-access-cr5d5\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.251587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-utilities\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.251668 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-catalog-content\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.251769 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr5d5\" (UniqueName: \"kubernetes.io/projected/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-kube-api-access-cr5d5\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.252180 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-utilities\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.252376 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-catalog-content\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.282397 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr5d5\" (UniqueName: \"kubernetes.io/projected/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-kube-api-access-cr5d5\") pod \"certified-operators-cb8l4\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.355677 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:44 crc kubenswrapper[4732]: I1010 09:15:44.877445 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cb8l4"] Oct 10 09:15:45 crc kubenswrapper[4732]: I1010 09:15:45.696910 4732 generic.go:334] "Generic (PLEG): container finished" podID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerID="2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5" exitCode=0 Oct 10 09:15:45 crc kubenswrapper[4732]: I1010 09:15:45.696988 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb8l4" event={"ID":"12a73edf-8b1b-4b0a-8ca2-dce3fb775932","Type":"ContainerDied","Data":"2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5"} Oct 10 09:15:45 crc kubenswrapper[4732]: I1010 09:15:45.698290 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb8l4" event={"ID":"12a73edf-8b1b-4b0a-8ca2-dce3fb775932","Type":"ContainerStarted","Data":"ff25b6da189ec71a569e1c26046cddfb78ed393214f42183e83aa546cef51ce4"} Oct 10 09:15:45 crc kubenswrapper[4732]: I1010 09:15:45.699913 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.433814 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ts66s"] Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.435843 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.449405 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ts66s"] Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.496858 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-utilities\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.496943 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-catalog-content\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.497024 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg69n\" (UniqueName: \"kubernetes.io/projected/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-kube-api-access-tg69n\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.598566 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-utilities\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.599392 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-catalog-content\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.599510 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg69n\" (UniqueName: \"kubernetes.io/projected/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-kube-api-access-tg69n\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.599229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-utilities\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.600240 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-catalog-content\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.621992 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg69n\" (UniqueName: \"kubernetes.io/projected/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-kube-api-access-tg69n\") pod \"community-operators-ts66s\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:46 crc kubenswrapper[4732]: I1010 09:15:46.801947 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:47 crc kubenswrapper[4732]: I1010 09:15:47.310537 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ts66s"] Oct 10 09:15:47 crc kubenswrapper[4732]: I1010 09:15:47.715914 4732 generic.go:334] "Generic (PLEG): container finished" podID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerID="ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef" exitCode=0 Oct 10 09:15:47 crc kubenswrapper[4732]: I1010 09:15:47.715958 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ts66s" event={"ID":"4bfe81ad-000c-42dc-b55b-b1c8e34ade96","Type":"ContainerDied","Data":"ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef"} Oct 10 09:15:47 crc kubenswrapper[4732]: I1010 09:15:47.716271 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ts66s" event={"ID":"4bfe81ad-000c-42dc-b55b-b1c8e34ade96","Type":"ContainerStarted","Data":"428986dc834a82f77f6537519be86e7de549db0b5f6db2e79b07fc14aef62b29"} Oct 10 09:15:47 crc kubenswrapper[4732]: I1010 09:15:47.718296 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb8l4" event={"ID":"12a73edf-8b1b-4b0a-8ca2-dce3fb775932","Type":"ContainerStarted","Data":"c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5"} Oct 10 09:15:48 crc kubenswrapper[4732]: I1010 09:15:48.730351 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ts66s" event={"ID":"4bfe81ad-000c-42dc-b55b-b1c8e34ade96","Type":"ContainerStarted","Data":"4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90"} Oct 10 09:15:48 crc kubenswrapper[4732]: I1010 09:15:48.733526 4732 generic.go:334] "Generic (PLEG): container finished" podID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" 
containerID="c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5" exitCode=0 Oct 10 09:15:48 crc kubenswrapper[4732]: I1010 09:15:48.733628 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb8l4" event={"ID":"12a73edf-8b1b-4b0a-8ca2-dce3fb775932","Type":"ContainerDied","Data":"c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5"} Oct 10 09:15:49 crc kubenswrapper[4732]: I1010 09:15:49.745010 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb8l4" event={"ID":"12a73edf-8b1b-4b0a-8ca2-dce3fb775932","Type":"ContainerStarted","Data":"12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd"} Oct 10 09:15:49 crc kubenswrapper[4732]: I1010 09:15:49.774679 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cb8l4" podStartSLOduration=2.193112949 podStartE2EDuration="5.774658644s" podCreationTimestamp="2025-10-10 09:15:44 +0000 UTC" firstStartedPulling="2025-10-10 09:15:45.699641063 +0000 UTC m=+8672.769232304" lastFinishedPulling="2025-10-10 09:15:49.281186758 +0000 UTC m=+8676.350777999" observedRunningTime="2025-10-10 09:15:49.766173993 +0000 UTC m=+8676.835765234" watchObservedRunningTime="2025-10-10 09:15:49.774658644 +0000 UTC m=+8676.844249885" Oct 10 09:15:50 crc kubenswrapper[4732]: I1010 09:15:50.757080 4732 generic.go:334] "Generic (PLEG): container finished" podID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerID="4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90" exitCode=0 Oct 10 09:15:50 crc kubenswrapper[4732]: I1010 09:15:50.757146 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ts66s" event={"ID":"4bfe81ad-000c-42dc-b55b-b1c8e34ade96","Type":"ContainerDied","Data":"4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90"} Oct 10 09:15:51 crc kubenswrapper[4732]: I1010 
09:15:51.769466 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ts66s" event={"ID":"4bfe81ad-000c-42dc-b55b-b1c8e34ade96","Type":"ContainerStarted","Data":"41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e"} Oct 10 09:15:51 crc kubenswrapper[4732]: I1010 09:15:51.790306 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ts66s" podStartSLOduration=2.0642287599999998 podStartE2EDuration="5.790281882s" podCreationTimestamp="2025-10-10 09:15:46 +0000 UTC" firstStartedPulling="2025-10-10 09:15:47.717984326 +0000 UTC m=+8674.787575567" lastFinishedPulling="2025-10-10 09:15:51.444037438 +0000 UTC m=+8678.513628689" observedRunningTime="2025-10-10 09:15:51.786627433 +0000 UTC m=+8678.856218674" watchObservedRunningTime="2025-10-10 09:15:51.790281882 +0000 UTC m=+8678.859873123" Oct 10 09:15:54 crc kubenswrapper[4732]: I1010 09:15:54.356722 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:54 crc kubenswrapper[4732]: I1010 09:15:54.357278 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:15:54 crc kubenswrapper[4732]: E1010 09:15:54.636668 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a0f923f_84b2_44b1_9030_636b08eff952.slice/crio-conmon-007882563f064a2f0f9a64886124c1ef2491386038ed27e12c81b9d4105edeae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a0f923f_84b2_44b1_9030_636b08eff952.slice/crio-007882563f064a2f0f9a64886124c1ef2491386038ed27e12c81b9d4105edeae.scope\": RecentStats: unable to find data in memory cache]" Oct 10 09:15:54 crc 
kubenswrapper[4732]: I1010 09:15:54.660487 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:15:54 crc kubenswrapper[4732]: E1010 09:15:54.660765 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:15:54 crc kubenswrapper[4732]: I1010 09:15:54.797583 4732 generic.go:334] "Generic (PLEG): container finished" podID="9a0f923f-84b2-44b1-9030-636b08eff952" containerID="007882563f064a2f0f9a64886124c1ef2491386038ed27e12c81b9d4105edeae" exitCode=0 Oct 10 09:15:54 crc kubenswrapper[4732]: I1010 09:15:54.797650 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx" event={"ID":"9a0f923f-84b2-44b1-9030-636b08eff952","Type":"ContainerDied","Data":"007882563f064a2f0f9a64886124c1ef2491386038ed27e12c81b9d4105edeae"} Oct 10 09:15:55 crc kubenswrapper[4732]: I1010 09:15:55.415424 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cb8l4" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="registry-server" probeResult="failure" output=< Oct 10 09:15:55 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 09:15:55 crc kubenswrapper[4732]: > Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.214588 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.288927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvbrq\" (UniqueName: \"kubernetes.io/projected/9a0f923f-84b2-44b1-9030-636b08eff952-kube-api-access-qvbrq\") pod \"9a0f923f-84b2-44b1-9030-636b08eff952\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.288977 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-combined-ca-bundle\") pod \"9a0f923f-84b2-44b1-9030-636b08eff952\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.289034 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-ssh-key\") pod \"9a0f923f-84b2-44b1-9030-636b08eff952\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.289107 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-agent-neutron-config-0\") pod \"9a0f923f-84b2-44b1-9030-636b08eff952\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.289222 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-inventory\") pod \"9a0f923f-84b2-44b1-9030-636b08eff952\" (UID: \"9a0f923f-84b2-44b1-9030-636b08eff952\") " Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.296163 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9a0f923f-84b2-44b1-9030-636b08eff952" (UID: "9a0f923f-84b2-44b1-9030-636b08eff952"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.303605 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0f923f-84b2-44b1-9030-636b08eff952-kube-api-access-qvbrq" (OuterVolumeSpecName: "kube-api-access-qvbrq") pod "9a0f923f-84b2-44b1-9030-636b08eff952" (UID: "9a0f923f-84b2-44b1-9030-636b08eff952"). InnerVolumeSpecName "kube-api-access-qvbrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.317463 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9a0f923f-84b2-44b1-9030-636b08eff952" (UID: "9a0f923f-84b2-44b1-9030-636b08eff952"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.317538 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-inventory" (OuterVolumeSpecName: "inventory") pod "9a0f923f-84b2-44b1-9030-636b08eff952" (UID: "9a0f923f-84b2-44b1-9030-636b08eff952"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.318341 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "9a0f923f-84b2-44b1-9030-636b08eff952" (UID: "9a0f923f-84b2-44b1-9030-636b08eff952"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.391622 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvbrq\" (UniqueName: \"kubernetes.io/projected/9a0f923f-84b2-44b1-9030-636b08eff952-kube-api-access-qvbrq\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.391657 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.391671 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.391682 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.391709 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a0f923f-84b2-44b1-9030-636b08eff952-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 
09:15:56.802910 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.802963 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.817615 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx" event={"ID":"9a0f923f-84b2-44b1-9030-636b08eff952","Type":"ContainerDied","Data":"ae88722c84bed413d43dd22b00e5033eaaf3f37346a635835c7f7598089927ed"} Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.817650 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae88722c84bed413d43dd22b00e5033eaaf3f37346a635835c7f7598089927ed" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.817761 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-g7zhx" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.897865 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.961928 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf"] Oct 10 09:15:56 crc kubenswrapper[4732]: E1010 09:15:56.962400 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0f923f-84b2-44b1-9030-636b08eff952" containerName="neutron-sriov-openstack-openstack-cell1" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.962420 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0f923f-84b2-44b1-9030-636b08eff952" containerName="neutron-sriov-openstack-openstack-cell1" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.962724 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9a0f923f-84b2-44b1-9030-636b08eff952" containerName="neutron-sriov-openstack-openstack-cell1" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.963968 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.966966 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.967202 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.967380 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.967395 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.967407 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.978669 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf"] Oct 10 09:15:56 crc kubenswrapper[4732]: I1010 09:15:56.987077 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.003272 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.003325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.004653 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.004805 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4bjk\" (UniqueName: \"kubernetes.io/projected/6dd18d28-5db6-4fff-96a8-320afc9e7638-kube-api-access-x4bjk\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.005009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.107117 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.107226 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.107261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.107293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.107330 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4bjk\" (UniqueName: \"kubernetes.io/projected/6dd18d28-5db6-4fff-96a8-320afc9e7638-kube-api-access-x4bjk\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.111297 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.111583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.112482 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.117391 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.131403 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4bjk\" (UniqueName: 
\"kubernetes.io/projected/6dd18d28-5db6-4fff-96a8-320afc9e7638-kube-api-access-x4bjk\") pod \"neutron-dhcp-openstack-openstack-cell1-l5xlf\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.150897 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ts66s"] Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.288779 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:15:57 crc kubenswrapper[4732]: I1010 09:15:57.909126 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf"] Oct 10 09:15:58 crc kubenswrapper[4732]: I1010 09:15:58.837452 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" event={"ID":"6dd18d28-5db6-4fff-96a8-320afc9e7638","Type":"ContainerStarted","Data":"1d874af493262bdf7e20e8ce15311ad6846fbe5998e86dff635175e83f7bfdad"} Oct 10 09:15:58 crc kubenswrapper[4732]: I1010 09:15:58.838024 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" event={"ID":"6dd18d28-5db6-4fff-96a8-320afc9e7638","Type":"ContainerStarted","Data":"8e3f26ba7c0ccf30802e77a334c392c88be0d763da15650b657fe493d66dd77c"} Oct 10 09:15:58 crc kubenswrapper[4732]: I1010 09:15:58.837529 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ts66s" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="registry-server" containerID="cri-o://41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e" gracePeriod=2 Oct 10 09:15:58 crc kubenswrapper[4732]: I1010 09:15:58.866344 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" podStartSLOduration=2.371678463 podStartE2EDuration="2.86630021s" podCreationTimestamp="2025-10-10 09:15:56 +0000 UTC" firstStartedPulling="2025-10-10 09:15:57.917842687 +0000 UTC m=+8684.987433918" lastFinishedPulling="2025-10-10 09:15:58.412464384 +0000 UTC m=+8685.482055665" observedRunningTime="2025-10-10 09:15:58.865032945 +0000 UTC m=+8685.934624206" watchObservedRunningTime="2025-10-10 09:15:58.86630021 +0000 UTC m=+8685.935891471" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.329877 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.460464 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg69n\" (UniqueName: \"kubernetes.io/projected/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-kube-api-access-tg69n\") pod \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.460580 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-utilities\") pod \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.460611 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-catalog-content\") pod \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\" (UID: \"4bfe81ad-000c-42dc-b55b-b1c8e34ade96\") " Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.461530 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-utilities" 
(OuterVolumeSpecName: "utilities") pod "4bfe81ad-000c-42dc-b55b-b1c8e34ade96" (UID: "4bfe81ad-000c-42dc-b55b-b1c8e34ade96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.465528 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-kube-api-access-tg69n" (OuterVolumeSpecName: "kube-api-access-tg69n") pod "4bfe81ad-000c-42dc-b55b-b1c8e34ade96" (UID: "4bfe81ad-000c-42dc-b55b-b1c8e34ade96"). InnerVolumeSpecName "kube-api-access-tg69n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.512551 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bfe81ad-000c-42dc-b55b-b1c8e34ade96" (UID: "4bfe81ad-000c-42dc-b55b-b1c8e34ade96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.563323 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.563364 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.563374 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg69n\" (UniqueName: \"kubernetes.io/projected/4bfe81ad-000c-42dc-b55b-b1c8e34ade96-kube-api-access-tg69n\") on node \"crc\" DevicePath \"\"" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.849379 4732 generic.go:334] "Generic (PLEG): container finished" podID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerID="41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e" exitCode=0 Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.849442 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ts66s" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.849442 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ts66s" event={"ID":"4bfe81ad-000c-42dc-b55b-b1c8e34ade96","Type":"ContainerDied","Data":"41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e"} Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.849495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ts66s" event={"ID":"4bfe81ad-000c-42dc-b55b-b1c8e34ade96","Type":"ContainerDied","Data":"428986dc834a82f77f6537519be86e7de549db0b5f6db2e79b07fc14aef62b29"} Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.849518 4732 scope.go:117] "RemoveContainer" containerID="41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.873570 4732 scope.go:117] "RemoveContainer" containerID="4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.875144 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ts66s"] Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.886068 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ts66s"] Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.913614 4732 scope.go:117] "RemoveContainer" containerID="ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.957109 4732 scope.go:117] "RemoveContainer" containerID="41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e" Oct 10 09:15:59 crc kubenswrapper[4732]: E1010 09:15:59.957877 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e\": container with ID starting with 41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e not found: ID does not exist" containerID="41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.957906 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e"} err="failed to get container status \"41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e\": rpc error: code = NotFound desc = could not find container \"41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e\": container with ID starting with 41c37a96e9261b7b5b13e425af411d2be4ea88dd12ec089b3f26ee5cd1a5597e not found: ID does not exist" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.957928 4732 scope.go:117] "RemoveContainer" containerID="4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90" Oct 10 09:15:59 crc kubenswrapper[4732]: E1010 09:15:59.958270 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90\": container with ID starting with 4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90 not found: ID does not exist" containerID="4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.958295 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90"} err="failed to get container status \"4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90\": rpc error: code = NotFound desc = could not find container \"4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90\": container with ID 
starting with 4e5feef863e18cd87367ea7792b3ad7cdb06a3a113b9fad6f85ee9657bfe6c90 not found: ID does not exist" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.958309 4732 scope.go:117] "RemoveContainer" containerID="ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef" Oct 10 09:15:59 crc kubenswrapper[4732]: E1010 09:15:59.958549 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef\": container with ID starting with ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef not found: ID does not exist" containerID="ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef" Oct 10 09:15:59 crc kubenswrapper[4732]: I1010 09:15:59.958573 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef"} err="failed to get container status \"ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef\": rpc error: code = NotFound desc = could not find container \"ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef\": container with ID starting with ac1f89517963f98e5c86a7fc6e4f0aa8d3089323b600f2ee755317b9f48239ef not found: ID does not exist" Oct 10 09:16:01 crc kubenswrapper[4732]: I1010 09:16:01.672438 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" path="/var/lib/kubelet/pods/4bfe81ad-000c-42dc-b55b-b1c8e34ade96/volumes" Oct 10 09:16:04 crc kubenswrapper[4732]: I1010 09:16:04.402634 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:16:04 crc kubenswrapper[4732]: I1010 09:16:04.459419 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:16:04 crc 
kubenswrapper[4732]: I1010 09:16:04.656825 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cb8l4"] Oct 10 09:16:05 crc kubenswrapper[4732]: I1010 09:16:05.905087 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cb8l4" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="registry-server" containerID="cri-o://12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd" gracePeriod=2 Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.440513 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.636359 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr5d5\" (UniqueName: \"kubernetes.io/projected/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-kube-api-access-cr5d5\") pod \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.636526 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-catalog-content\") pod \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.636680 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-utilities\") pod \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\" (UID: \"12a73edf-8b1b-4b0a-8ca2-dce3fb775932\") " Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.637752 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-utilities" (OuterVolumeSpecName: "utilities") pod "12a73edf-8b1b-4b0a-8ca2-dce3fb775932" (UID: "12a73edf-8b1b-4b0a-8ca2-dce3fb775932"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.642862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-kube-api-access-cr5d5" (OuterVolumeSpecName: "kube-api-access-cr5d5") pod "12a73edf-8b1b-4b0a-8ca2-dce3fb775932" (UID: "12a73edf-8b1b-4b0a-8ca2-dce3fb775932"). InnerVolumeSpecName "kube-api-access-cr5d5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.678456 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12a73edf-8b1b-4b0a-8ca2-dce3fb775932" (UID: "12a73edf-8b1b-4b0a-8ca2-dce3fb775932"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.741183 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr5d5\" (UniqueName: \"kubernetes.io/projected/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-kube-api-access-cr5d5\") on node \"crc\" DevicePath \"\"" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.741222 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.741232 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12a73edf-8b1b-4b0a-8ca2-dce3fb775932-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.915639 4732 generic.go:334] "Generic (PLEG): container finished" podID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerID="12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd" exitCode=0 Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.915718 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb8l4" event={"ID":"12a73edf-8b1b-4b0a-8ca2-dce3fb775932","Type":"ContainerDied","Data":"12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd"} Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.915782 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cb8l4" event={"ID":"12a73edf-8b1b-4b0a-8ca2-dce3fb775932","Type":"ContainerDied","Data":"ff25b6da189ec71a569e1c26046cddfb78ed393214f42183e83aa546cef51ce4"} Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.915820 4732 scope.go:117] "RemoveContainer" containerID="12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 
09:16:06.915831 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cb8l4" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.947436 4732 scope.go:117] "RemoveContainer" containerID="c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5" Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.953482 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cb8l4"] Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.968308 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cb8l4"] Oct 10 09:16:06 crc kubenswrapper[4732]: I1010 09:16:06.990054 4732 scope.go:117] "RemoveContainer" containerID="2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5" Oct 10 09:16:07 crc kubenswrapper[4732]: I1010 09:16:07.021233 4732 scope.go:117] "RemoveContainer" containerID="12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd" Oct 10 09:16:07 crc kubenswrapper[4732]: E1010 09:16:07.021638 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd\": container with ID starting with 12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd not found: ID does not exist" containerID="12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd" Oct 10 09:16:07 crc kubenswrapper[4732]: I1010 09:16:07.021667 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd"} err="failed to get container status \"12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd\": rpc error: code = NotFound desc = could not find container \"12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd\": container with ID starting with 
12060c2cf5623362ffe2be36ab4d52e2bfce5139930b09f7ca10004ccda7cdcd not found: ID does not exist" Oct 10 09:16:07 crc kubenswrapper[4732]: I1010 09:16:07.021702 4732 scope.go:117] "RemoveContainer" containerID="c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5" Oct 10 09:16:07 crc kubenswrapper[4732]: E1010 09:16:07.022562 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5\": container with ID starting with c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5 not found: ID does not exist" containerID="c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5" Oct 10 09:16:07 crc kubenswrapper[4732]: I1010 09:16:07.022589 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5"} err="failed to get container status \"c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5\": rpc error: code = NotFound desc = could not find container \"c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5\": container with ID starting with c6d7e1a4fbe807db17eacb4573f71e69788553e96808f1f39ba071b9a1a825e5 not found: ID does not exist" Oct 10 09:16:07 crc kubenswrapper[4732]: I1010 09:16:07.022609 4732 scope.go:117] "RemoveContainer" containerID="2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5" Oct 10 09:16:07 crc kubenswrapper[4732]: E1010 09:16:07.022970 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5\": container with ID starting with 2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5 not found: ID does not exist" containerID="2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5" Oct 10 09:16:07 crc 
kubenswrapper[4732]: I1010 09:16:07.023150 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5"} err="failed to get container status \"2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5\": rpc error: code = NotFound desc = could not find container \"2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5\": container with ID starting with 2d0e1a610e19c0aca11a7862492cc9e72b29664ea5f1dfa0f6dee901ec1b6cc5 not found: ID does not exist" Oct 10 09:16:07 crc kubenswrapper[4732]: I1010 09:16:07.672211 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" path="/var/lib/kubelet/pods/12a73edf-8b1b-4b0a-8ca2-dce3fb775932/volumes" Oct 10 09:16:08 crc kubenswrapper[4732]: I1010 09:16:08.660784 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:16:08 crc kubenswrapper[4732]: I1010 09:16:08.937344 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"eaff4c9179560058a51d4453defd76684d517cb107f480f0e8aec772566d759b"} Oct 10 09:18:04 crc kubenswrapper[4732]: I1010 09:18:04.130760 4732 generic.go:334] "Generic (PLEG): container finished" podID="6dd18d28-5db6-4fff-96a8-320afc9e7638" containerID="1d874af493262bdf7e20e8ce15311ad6846fbe5998e86dff635175e83f7bfdad" exitCode=0 Oct 10 09:18:04 crc kubenswrapper[4732]: I1010 09:18:04.130889 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" event={"ID":"6dd18d28-5db6-4fff-96a8-320afc9e7638","Type":"ContainerDied","Data":"1d874af493262bdf7e20e8ce15311ad6846fbe5998e86dff635175e83f7bfdad"} Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.636966 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.812909 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-agent-neutron-config-0\") pod \"6dd18d28-5db6-4fff-96a8-320afc9e7638\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.813000 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-inventory\") pod \"6dd18d28-5db6-4fff-96a8-320afc9e7638\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.813262 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-combined-ca-bundle\") pod \"6dd18d28-5db6-4fff-96a8-320afc9e7638\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.813328 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4bjk\" (UniqueName: \"kubernetes.io/projected/6dd18d28-5db6-4fff-96a8-320afc9e7638-kube-api-access-x4bjk\") pod \"6dd18d28-5db6-4fff-96a8-320afc9e7638\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.813378 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-ssh-key\") pod \"6dd18d28-5db6-4fff-96a8-320afc9e7638\" (UID: \"6dd18d28-5db6-4fff-96a8-320afc9e7638\") " Oct 10 09:18:05 crc 
kubenswrapper[4732]: I1010 09:18:05.818907 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd18d28-5db6-4fff-96a8-320afc9e7638-kube-api-access-x4bjk" (OuterVolumeSpecName: "kube-api-access-x4bjk") pod "6dd18d28-5db6-4fff-96a8-320afc9e7638" (UID: "6dd18d28-5db6-4fff-96a8-320afc9e7638"). InnerVolumeSpecName "kube-api-access-x4bjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.819633 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "6dd18d28-5db6-4fff-96a8-320afc9e7638" (UID: "6dd18d28-5db6-4fff-96a8-320afc9e7638"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.847497 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-inventory" (OuterVolumeSpecName: "inventory") pod "6dd18d28-5db6-4fff-96a8-320afc9e7638" (UID: "6dd18d28-5db6-4fff-96a8-320afc9e7638"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.856614 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "6dd18d28-5db6-4fff-96a8-320afc9e7638" (UID: "6dd18d28-5db6-4fff-96a8-320afc9e7638"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.862635 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6dd18d28-5db6-4fff-96a8-320afc9e7638" (UID: "6dd18d28-5db6-4fff-96a8-320afc9e7638"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.920535 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.920642 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.920666 4732 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.920716 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4bjk\" (UniqueName: \"kubernetes.io/projected/6dd18d28-5db6-4fff-96a8-320afc9e7638-kube-api-access-x4bjk\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:05 crc kubenswrapper[4732]: I1010 09:18:05.921053 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6dd18d28-5db6-4fff-96a8-320afc9e7638-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:06 crc kubenswrapper[4732]: I1010 09:18:06.159576 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" event={"ID":"6dd18d28-5db6-4fff-96a8-320afc9e7638","Type":"ContainerDied","Data":"8e3f26ba7c0ccf30802e77a334c392c88be0d763da15650b657fe493d66dd77c"} Oct 10 09:18:06 crc kubenswrapper[4732]: I1010 09:18:06.159621 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e3f26ba7c0ccf30802e77a334c392c88be0d763da15650b657fe493d66dd77c" Oct 10 09:18:06 crc kubenswrapper[4732]: I1010 09:18:06.159653 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-l5xlf" Oct 10 09:18:25 crc kubenswrapper[4732]: I1010 09:18:25.356650 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:18:25 crc kubenswrapper[4732]: I1010 09:18:25.357653 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.184903 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.189220 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="52200daf-815f-40c7-8359-a7140dcd863f" containerName="nova-cell0-conductor-conductor" containerID="cri-o://13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07" gracePeriod=30 Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.202124 4732 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.202409 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="c31709ff-c729-4dbc-a23f-11f545334204" containerName="nova-cell1-conductor-conductor" containerID="cri-o://390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" gracePeriod=30 Oct 10 09:18:35 crc kubenswrapper[4732]: E1010 09:18:35.755360 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 09:18:35 crc kubenswrapper[4732]: E1010 09:18:35.758279 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 09:18:35 crc kubenswrapper[4732]: E1010 09:18:35.762409 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 09:18:35 crc kubenswrapper[4732]: E1010 09:18:35.762464 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="c31709ff-c729-4dbc-a23f-11f545334204" 
containerName="nova-cell1-conductor-conductor" Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.856776 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.857193 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-log" containerID="cri-o://c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468" gracePeriod=30 Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.857630 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-api" containerID="cri-o://c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f" gracePeriod=30 Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.878073 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.878299 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" containerName="nova-scheduler-scheduler" containerID="cri-o://bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d" gracePeriod=30 Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.890035 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.890276 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-log" containerID="cri-o://fb4f7baff8f2803876f74e2918bc3d30d740ef069aabbb540f9291ef255d5fe1" gracePeriod=30 Oct 10 09:18:35 crc kubenswrapper[4732]: I1010 09:18:35.890609 4732 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-metadata-0" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-metadata" containerID="cri-o://a48747372aa6f781c05187465d1498190ad8ecb8a1bc0ce35ede767afbc3a5dd" gracePeriod=30 Oct 10 09:18:36 crc kubenswrapper[4732]: E1010 09:18:36.475290 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 09:18:36 crc kubenswrapper[4732]: E1010 09:18:36.476639 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 09:18:36 crc kubenswrapper[4732]: E1010 09:18:36.477882 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 10 09:18:36 crc kubenswrapper[4732]: E1010 09:18:36.477927 4732 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="52200daf-815f-40c7-8359-a7140dcd863f" containerName="nova-cell0-conductor-conductor" Oct 10 09:18:36 crc kubenswrapper[4732]: I1010 09:18:36.501847 4732 generic.go:334] "Generic (PLEG): container finished" podID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" 
containerID="fb4f7baff8f2803876f74e2918bc3d30d740ef069aabbb540f9291ef255d5fe1" exitCode=143 Oct 10 09:18:36 crc kubenswrapper[4732]: I1010 09:18:36.501937 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d","Type":"ContainerDied","Data":"fb4f7baff8f2803876f74e2918bc3d30d740ef069aabbb540f9291ef255d5fe1"} Oct 10 09:18:36 crc kubenswrapper[4732]: I1010 09:18:36.503623 4732 generic.go:334] "Generic (PLEG): container finished" podID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerID="c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468" exitCode=143 Oct 10 09:18:36 crc kubenswrapper[4732]: I1010 09:18:36.503666 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f30848-6b41-4978-9b03-afcbb1e9618e","Type":"ContainerDied","Data":"c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468"} Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.281369 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.422602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/c31709ff-c729-4dbc-a23f-11f545334204-kube-api-access-kfg5c\") pod \"c31709ff-c729-4dbc-a23f-11f545334204\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.422991 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-combined-ca-bundle\") pod \"c31709ff-c729-4dbc-a23f-11f545334204\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.423230 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-config-data\") pod \"c31709ff-c729-4dbc-a23f-11f545334204\" (UID: \"c31709ff-c729-4dbc-a23f-11f545334204\") " Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.431836 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31709ff-c729-4dbc-a23f-11f545334204-kube-api-access-kfg5c" (OuterVolumeSpecName: "kube-api-access-kfg5c") pod "c31709ff-c729-4dbc-a23f-11f545334204" (UID: "c31709ff-c729-4dbc-a23f-11f545334204"). InnerVolumeSpecName "kube-api-access-kfg5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.458863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c31709ff-c729-4dbc-a23f-11f545334204" (UID: "c31709ff-c729-4dbc-a23f-11f545334204"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.459236 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-config-data" (OuterVolumeSpecName: "config-data") pod "c31709ff-c729-4dbc-a23f-11f545334204" (UID: "c31709ff-c729-4dbc-a23f-11f545334204"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.512881 4732 generic.go:334] "Generic (PLEG): container finished" podID="c31709ff-c729-4dbc-a23f-11f545334204" containerID="390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" exitCode=0 Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.512958 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.513951 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c31709ff-c729-4dbc-a23f-11f545334204","Type":"ContainerDied","Data":"390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68"} Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.514047 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c31709ff-c729-4dbc-a23f-11f545334204","Type":"ContainerDied","Data":"7190b578bf25319e3f7632bc8f2aad0d4600ea5f1087e9ff62b0fe9cf203b6a3"} Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.514125 4732 scope.go:117] "RemoveContainer" containerID="390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.526549 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/c31709ff-c729-4dbc-a23f-11f545334204-kube-api-access-kfg5c\") on node \"crc\" DevicePath \"\"" Oct 10 
09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.526583 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.526597 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31709ff-c729-4dbc-a23f-11f545334204-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.557954 4732 scope.go:117] "RemoveContainer" containerID="390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.558403 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68\": container with ID starting with 390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68 not found: ID does not exist" containerID="390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.558449 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68"} err="failed to get container status \"390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68\": rpc error: code = NotFound desc = could not find container \"390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68\": container with ID starting with 390c72ecc1258da23102119a191535555dd90793f62ae628cdc3e7c550ee0a68 not found: ID does not exist" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.568965 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.584800 4732 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.593470 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.593960 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="extract-utilities" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.593982 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="extract-utilities" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.594001 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="extract-utilities" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594009 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="extract-utilities" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.594028 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd18d28-5db6-4fff-96a8-320afc9e7638" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594035 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd18d28-5db6-4fff-96a8-320afc9e7638" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.594053 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31709ff-c729-4dbc-a23f-11f545334204" containerName="nova-cell1-conductor-conductor" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594059 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31709ff-c729-4dbc-a23f-11f545334204" containerName="nova-cell1-conductor-conductor" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.594068 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="registry-server" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594074 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="registry-server" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.594086 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="registry-server" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594092 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="registry-server" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.594102 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="extract-content" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594107 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="extract-content" Oct 10 09:18:37 crc kubenswrapper[4732]: E1010 09:18:37.594116 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="extract-content" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594122 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="extract-content" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594339 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bfe81ad-000c-42dc-b55b-b1c8e34ade96" containerName="registry-server" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594358 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd18d28-5db6-4fff-96a8-320afc9e7638" containerName="neutron-dhcp-openstack-openstack-cell1" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 
09:18:37.594367 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="12a73edf-8b1b-4b0a-8ca2-dce3fb775932" containerName="registry-server" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.594384 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31709ff-c729-4dbc-a23f-11f545334204" containerName="nova-cell1-conductor-conductor" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.595158 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.597003 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.603808 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.680891 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31709ff-c729-4dbc-a23f-11f545334204" path="/var/lib/kubelet/pods/c31709ff-c729-4dbc-a23f-11f545334204/volumes" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.733957 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdjnq\" (UniqueName: \"kubernetes.io/projected/4b06e190-6280-4b02-9712-214229f1c30f-kube-api-access-jdjnq\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.734011 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b06e190-6280-4b02-9712-214229f1c30f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 
09:18:37.734061 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b06e190-6280-4b02-9712-214229f1c30f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.836151 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdjnq\" (UniqueName: \"kubernetes.io/projected/4b06e190-6280-4b02-9712-214229f1c30f-kube-api-access-jdjnq\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.836213 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b06e190-6280-4b02-9712-214229f1c30f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.836243 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b06e190-6280-4b02-9712-214229f1c30f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.839991 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b06e190-6280-4b02-9712-214229f1c30f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.847211 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4b06e190-6280-4b02-9712-214229f1c30f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.855980 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdjnq\" (UniqueName: \"kubernetes.io/projected/4b06e190-6280-4b02-9712-214229f1c30f-kube-api-access-jdjnq\") pod \"nova-cell1-conductor-0\" (UID: \"4b06e190-6280-4b02-9712-214229f1c30f\") " pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:37 crc kubenswrapper[4732]: I1010 09:18:37.931661 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:38 crc kubenswrapper[4732]: I1010 09:18:38.248295 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 10 09:18:38 crc kubenswrapper[4732]: I1010 09:18:38.532391 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4b06e190-6280-4b02-9712-214229f1c30f","Type":"ContainerStarted","Data":"fc704c6a927bf0dadde6bf8eaf98152c2d611c19458a7db177262f2dfcfbc388"} Oct 10 09:18:38 crc kubenswrapper[4732]: I1010 09:18:38.532774 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4b06e190-6280-4b02-9712-214229f1c30f","Type":"ContainerStarted","Data":"923e55c6ef95c3a011b3ec71cce46e00f062c939bc41d965f48018b07984eb6b"} Oct 10 09:18:38 crc kubenswrapper[4732]: I1010 09:18:38.533330 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:38 crc kubenswrapper[4732]: I1010 09:18:38.977366 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.003188 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.003168014 podStartE2EDuration="2.003168014s" podCreationTimestamp="2025-10-10 09:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:18:38.551307666 +0000 UTC m=+8845.620898907" watchObservedRunningTime="2025-10-10 09:18:39.003168014 +0000 UTC m=+8846.072759275" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.041977 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.94:8775/\": read tcp 10.217.0.2:53674->10.217.1.94:8775: read: connection reset by peer" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.042020 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.94:8775/\": read tcp 10.217.0.2:53662->10.217.1.94:8775: read: connection reset by peer" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.062631 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk8qq\" (UniqueName: \"kubernetes.io/projected/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-kube-api-access-bk8qq\") pod \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.062672 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-config-data\") pod 
\"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.062906 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-combined-ca-bundle\") pod \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\" (UID: \"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.071320 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-kube-api-access-bk8qq" (OuterVolumeSpecName: "kube-api-access-bk8qq") pod "04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" (UID: "04ac2b28-9f8e-40bb-9b11-1a70ebc6f745"). InnerVolumeSpecName "kube-api-access-bk8qq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.094025 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88f30848_6b41_4978_9b03_afcbb1e9618e.slice/crio-c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod651f6808_b9ca_4a88_a7c4_1e3f8b3c5a3d.slice/crio-a48747372aa6f781c05187465d1498190ad8ecb8a1bc0ce35ede767afbc3a5dd.scope\": RecentStats: unable to find data in memory cache]" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.105833 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-config-data" (OuterVolumeSpecName: "config-data") pod "04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" (UID: "04ac2b28-9f8e-40bb-9b11-1a70ebc6f745"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.134386 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" (UID: "04ac2b28-9f8e-40bb-9b11-1a70ebc6f745"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.165750 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.165775 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk8qq\" (UniqueName: \"kubernetes.io/projected/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-kube-api-access-bk8qq\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.165791 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.454549 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.546402 4732 generic.go:334] "Generic (PLEG): container finished" podID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerID="a48747372aa6f781c05187465d1498190ad8ecb8a1bc0ce35ede767afbc3a5dd" exitCode=0 Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.546461 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d","Type":"ContainerDied","Data":"a48747372aa6f781c05187465d1498190ad8ecb8a1bc0ce35ede767afbc3a5dd"} Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.546482 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d","Type":"ContainerDied","Data":"4467d879c7a1594b61df29bd85bd400632968b5a75940da6b16bb7c143115145"} Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.546495 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4467d879c7a1594b61df29bd85bd400632968b5a75940da6b16bb7c143115145" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.549897 4732 generic.go:334] "Generic (PLEG): container finished" podID="04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" containerID="bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d" exitCode=0 Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.549951 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.550009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745","Type":"ContainerDied","Data":"bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d"} Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.550044 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"04ac2b28-9f8e-40bb-9b11-1a70ebc6f745","Type":"ContainerDied","Data":"1d9884f19e00c2cb0a36ebecc3d6966685fa1356cc8e05d5533b1f5e73078a7d"} Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.550066 4732 scope.go:117] "RemoveContainer" containerID="bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.560297 4732 generic.go:334] "Generic (PLEG): container finished" podID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerID="c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f" exitCode=0 Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.560403 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.560415 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f30848-6b41-4978-9b03-afcbb1e9618e","Type":"ContainerDied","Data":"c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f"} Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.560474 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"88f30848-6b41-4978-9b03-afcbb1e9618e","Type":"ContainerDied","Data":"af3b23981bbc4b8a4d1e5685bc06dd84dcdf65636a20c626b0808dc9a484761a"} Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.566219 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.577799 4732 scope.go:117] "RemoveContainer" containerID="bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.579823 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-config-data\") pod \"88f30848-6b41-4978-9b03-afcbb1e9618e\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.579878 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-combined-ca-bundle\") pod \"88f30848-6b41-4978-9b03-afcbb1e9618e\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.579901 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-internal-tls-certs\") pod \"88f30848-6b41-4978-9b03-afcbb1e9618e\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.579943 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-public-tls-certs\") pod \"88f30848-6b41-4978-9b03-afcbb1e9618e\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.580017 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgs7h\" (UniqueName: \"kubernetes.io/projected/88f30848-6b41-4978-9b03-afcbb1e9618e-kube-api-access-qgs7h\") pod \"88f30848-6b41-4978-9b03-afcbb1e9618e\" (UID: 
\"88f30848-6b41-4978-9b03-afcbb1e9618e\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.580040 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f30848-6b41-4978-9b03-afcbb1e9618e-logs\") pod \"88f30848-6b41-4978-9b03-afcbb1e9618e\" (UID: \"88f30848-6b41-4978-9b03-afcbb1e9618e\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.581362 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88f30848-6b41-4978-9b03-afcbb1e9618e-logs" (OuterVolumeSpecName: "logs") pod "88f30848-6b41-4978-9b03-afcbb1e9618e" (UID: "88f30848-6b41-4978-9b03-afcbb1e9618e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.585214 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d\": container with ID starting with bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d not found: ID does not exist" containerID="bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.585269 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d"} err="failed to get container status \"bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d\": rpc error: code = NotFound desc = could not find container \"bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d\": container with ID starting with bb3c19f105f7b7d0242675f810c635cf2d9672ca6eef27c7cbf762417474676d not found: ID does not exist" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.585294 4732 scope.go:117] "RemoveContainer" 
containerID="c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.588431 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.596109 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88f30848-6b41-4978-9b03-afcbb1e9618e-kube-api-access-qgs7h" (OuterVolumeSpecName: "kube-api-access-qgs7h") pod "88f30848-6b41-4978-9b03-afcbb1e9618e" (UID: "88f30848-6b41-4978-9b03-afcbb1e9618e"). InnerVolumeSpecName "kube-api-access-qgs7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.617823 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88f30848-6b41-4978-9b03-afcbb1e9618e" (UID: "88f30848-6b41-4978-9b03-afcbb1e9618e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.621371 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.641547 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.642008 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-metadata" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642025 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-metadata" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.642042 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-log" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642048 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-log" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.642071 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-api" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642077 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-api" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.642094 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-log" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642100 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-log" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 
09:18:39.642115 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" containerName="nova-scheduler-scheduler" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642121 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" containerName="nova-scheduler-scheduler" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642317 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-metadata" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642339 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" containerName="nova-metadata-log" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642347 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" containerName="nova-scheduler-scheduler" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642369 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-log" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.642378 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" containerName="nova-api-api" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.643148 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-config-data" (OuterVolumeSpecName: "config-data") pod "88f30848-6b41-4978-9b03-afcbb1e9618e" (UID: "88f30848-6b41-4978-9b03-afcbb1e9618e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.643434 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.645391 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.649845 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.669038 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88f30848-6b41-4978-9b03-afcbb1e9618e" (UID: "88f30848-6b41-4978-9b03-afcbb1e9618e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.671045 4732 scope.go:117] "RemoveContainer" containerID="c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.680215 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04ac2b28-9f8e-40bb-9b11-1a70ebc6f745" path="/var/lib/kubelet/pods/04ac2b28-9f8e-40bb-9b11-1a70ebc6f745/volumes" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.682084 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7kjb\" (UniqueName: \"kubernetes.io/projected/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-kube-api-access-j7kjb\") pod \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.682290 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-logs\") pod \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " Oct 10 09:18:39 crc kubenswrapper[4732]: 
I1010 09:18:39.682333 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-nova-metadata-tls-certs\") pod \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.682361 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-config-data\") pod \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.682700 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-logs" (OuterVolumeSpecName: "logs") pod "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" (UID: "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.683009 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-combined-ca-bundle\") pod \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\" (UID: \"651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d\") " Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.685731 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-kube-api-access-j7kjb" (OuterVolumeSpecName: "kube-api-access-j7kjb") pod "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" (UID: "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d"). InnerVolumeSpecName "kube-api-access-j7kjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.686231 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-logs\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.686252 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.686262 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.686290 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.686305 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgs7h\" (UniqueName: \"kubernetes.io/projected/88f30848-6b41-4978-9b03-afcbb1e9618e-kube-api-access-qgs7h\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.686316 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88f30848-6b41-4978-9b03-afcbb1e9618e-logs\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.687701 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88f30848-6b41-4978-9b03-afcbb1e9618e" (UID: "88f30848-6b41-4978-9b03-afcbb1e9618e"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.704536 4732 scope.go:117] "RemoveContainer" containerID="c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.705075 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f\": container with ID starting with c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f not found: ID does not exist" containerID="c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.705202 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f"} err="failed to get container status \"c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f\": rpc error: code = NotFound desc = could not find container \"c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f\": container with ID starting with c34739f441fdab406a4924c25e551b0da24a6fc865f25632c70be67abaa2444f not found: ID does not exist" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.705287 4732 scope.go:117] "RemoveContainer" containerID="c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468" Oct 10 09:18:39 crc kubenswrapper[4732]: E1010 09:18:39.706248 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468\": container with ID starting with c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468 not found: ID does not exist" containerID="c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468" Oct 10 09:18:39 crc 
kubenswrapper[4732]: I1010 09:18:39.706341 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468"} err="failed to get container status \"c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468\": rpc error: code = NotFound desc = could not find container \"c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468\": container with ID starting with c9fa7ef3e4e97dd7ced76f83b07949642bb8f128e5a0507e9aa4adcdec970468 not found: ID does not exist" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.718932 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" (UID: "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.737923 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" (UID: "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.742973 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-config-data" (OuterVolumeSpecName: "config-data") pod "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" (UID: "651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.787685 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pnnb\" (UniqueName: \"kubernetes.io/projected/c9e2f176-f337-4dda-af59-d839d8985489-kube-api-access-9pnnb\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.788154 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2f176-f337-4dda-af59-d839d8985489-config-data\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.788199 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2f176-f337-4dda-af59-d839d8985489-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.788250 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88f30848-6b41-4978-9b03-afcbb1e9618e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.788262 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7kjb\" (UniqueName: \"kubernetes.io/projected/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-kube-api-access-j7kjb\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.788271 4732 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.788280 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.788288 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.891353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2f176-f337-4dda-af59-d839d8985489-config-data\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.891417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2f176-f337-4dda-af59-d839d8985489-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.891468 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pnnb\" (UniqueName: \"kubernetes.io/projected/c9e2f176-f337-4dda-af59-d839d8985489-kube-api-access-9pnnb\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.895357 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e2f176-f337-4dda-af59-d839d8985489-config-data\") 
pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.905040 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.910322 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e2f176-f337-4dda-af59-d839d8985489-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.913574 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.932152 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pnnb\" (UniqueName: \"kubernetes.io/projected/c9e2f176-f337-4dda-af59-d839d8985489-kube-api-access-9pnnb\") pod \"nova-scheduler-0\" (UID: \"c9e2f176-f337-4dda-af59-d839d8985489\") " pod="openstack/nova-scheduler-0" Oct 10 09:18:39 crc kubenswrapper[4732]: I1010 09:18:39.948585 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.000542 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.019195 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.024931 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.026823 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.027072 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.028084 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.208017 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dd5777-e881-497c-845b-97a5d608989c-logs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.208275 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.208297 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mq5\" (UniqueName: \"kubernetes.io/projected/84dd5777-e881-497c-845b-97a5d608989c-kube-api-access-d2mq5\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.208397 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.208468 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-config-data\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.208485 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-public-tls-certs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.313786 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.313846 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-config-data\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.313870 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-public-tls-certs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 
crc kubenswrapper[4732]: I1010 09:18:40.314009 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dd5777-e881-497c-845b-97a5d608989c-logs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.314033 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.314050 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2mq5\" (UniqueName: \"kubernetes.io/projected/84dd5777-e881-497c-845b-97a5d608989c-kube-api-access-d2mq5\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.315165 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84dd5777-e881-497c-845b-97a5d608989c-logs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.322194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-public-tls-certs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.322578 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-config-data\") pod \"nova-api-0\" (UID: 
\"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.323917 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.323913 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84dd5777-e881-497c-845b-97a5d608989c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.332116 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mq5\" (UniqueName: \"kubernetes.io/projected/84dd5777-e881-497c-845b-97a5d608989c-kube-api-access-d2mq5\") pod \"nova-api-0\" (UID: \"84dd5777-e881-497c-845b-97a5d608989c\") " pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.579139 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.592993 4732 generic.go:334] "Generic (PLEG): container finished" podID="52200daf-815f-40c7-8359-a7140dcd863f" containerID="13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07" exitCode=0 Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.593071 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"52200daf-815f-40c7-8359-a7140dcd863f","Type":"ContainerDied","Data":"13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07"} Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.599313 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.601341 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.684983 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.710985 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.731312 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.746649 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 10 09:18:40 crc kubenswrapper[4732]: E1010 09:18:40.747087 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52200daf-815f-40c7-8359-a7140dcd863f" containerName="nova-cell0-conductor-conductor" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.747098 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="52200daf-815f-40c7-8359-a7140dcd863f" containerName="nova-cell0-conductor-conductor" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.747281 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="52200daf-815f-40c7-8359-a7140dcd863f" containerName="nova-cell0-conductor-conductor" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.748314 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.752384 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.752704 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.769211 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.830151 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85qmc\" (UniqueName: \"kubernetes.io/projected/52200daf-815f-40c7-8359-a7140dcd863f-kube-api-access-85qmc\") pod \"52200daf-815f-40c7-8359-a7140dcd863f\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.830466 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-combined-ca-bundle\") pod \"52200daf-815f-40c7-8359-a7140dcd863f\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.830530 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-config-data\") pod \"52200daf-815f-40c7-8359-a7140dcd863f\" (UID: \"52200daf-815f-40c7-8359-a7140dcd863f\") " Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.830907 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wd2\" (UniqueName: \"kubernetes.io/projected/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-kube-api-access-t6wd2\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " 
pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.830982 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.831110 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-config-data\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.831172 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.831195 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-logs\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.839846 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52200daf-815f-40c7-8359-a7140dcd863f-kube-api-access-85qmc" (OuterVolumeSpecName: "kube-api-access-85qmc") pod "52200daf-815f-40c7-8359-a7140dcd863f" (UID: "52200daf-815f-40c7-8359-a7140dcd863f"). InnerVolumeSpecName "kube-api-access-85qmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.858221 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52200daf-815f-40c7-8359-a7140dcd863f" (UID: "52200daf-815f-40c7-8359-a7140dcd863f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.858800 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-config-data" (OuterVolumeSpecName: "config-data") pod "52200daf-815f-40c7-8359-a7140dcd863f" (UID: "52200daf-815f-40c7-8359-a7140dcd863f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.935792 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wd2\" (UniqueName: \"kubernetes.io/projected/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-kube-api-access-t6wd2\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.935893 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.935998 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-config-data\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " 
pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.936070 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-logs\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.936092 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.936567 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-logs\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.936680 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.936719 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52200daf-815f-40c7-8359-a7140dcd863f-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.936729 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85qmc\" (UniqueName: \"kubernetes.io/projected/52200daf-815f-40c7-8359-a7140dcd863f-kube-api-access-85qmc\") on node \"crc\" DevicePath \"\"" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.943429 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.943465 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.943540 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-config-data\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:40 crc kubenswrapper[4732]: I1010 09:18:40.950661 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wd2\" (UniqueName: \"kubernetes.io/projected/ff6afe7c-aa67-4af3-9de1-f1046d7ea386-kube-api-access-t6wd2\") pod \"nova-metadata-0\" (UID: \"ff6afe7c-aa67-4af3-9de1-f1046d7ea386\") " pod="openstack/nova-metadata-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.081856 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.141592 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.612556 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c9e2f176-f337-4dda-af59-d839d8985489","Type":"ContainerStarted","Data":"f6876bc5a36e1af0d7a853abeb74646182d2af04039b40d6aac30c1cc3f15f13"} Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.613723 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c9e2f176-f337-4dda-af59-d839d8985489","Type":"ContainerStarted","Data":"0aafc030119d218f12bc87d48860d84d9ead86c41a73828b2dc493eb91913f98"} Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.620855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84dd5777-e881-497c-845b-97a5d608989c","Type":"ContainerStarted","Data":"7717e2d0183c89d52cd00940ce1398299a188fb6bb407dfbad06cabd0b5ec5b7"} Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.620890 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84dd5777-e881-497c-845b-97a5d608989c","Type":"ContainerStarted","Data":"70f912863fd18478d246acd67890d3c25a5a19f0071616de3864532a68651d53"} Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.622435 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"52200daf-815f-40c7-8359-a7140dcd863f","Type":"ContainerDied","Data":"a3001508cbd31368e098021f38536f33a7f688fe8038815b6d2db0b1525adbfc"} Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.622492 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.622531 4732 scope.go:117] "RemoveContainer" containerID="13ce2fe8722ac2c2fbe8261363664f265232cf4a4568d322dfeaec61e3d7ca07" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.653596 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.654257 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.654237091 podStartE2EDuration="2.654237091s" podCreationTimestamp="2025-10-10 09:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:18:41.637130744 +0000 UTC m=+8848.706721995" watchObservedRunningTime="2025-10-10 09:18:41.654237091 +0000 UTC m=+8848.723828332" Oct 10 09:18:41 crc kubenswrapper[4732]: W1010 09:18:41.656411 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff6afe7c_aa67_4af3_9de1_f1046d7ea386.slice/crio-13a5bdde1c4132e6f3bdc753a6d584b77cff8639504ccea4547c46ede5c6e060 WatchSource:0}: Error finding container 13a5bdde1c4132e6f3bdc753a6d584b77cff8639504ccea4547c46ede5c6e060: Status 404 returned error can't find the container with id 13a5bdde1c4132e6f3bdc753a6d584b77cff8639504ccea4547c46ede5c6e060 Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.679417 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d" path="/var/lib/kubelet/pods/651f6808-b9ca-4a88-a7c4-1e3f8b3c5a3d/volumes" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.680523 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f30848-6b41-4978-9b03-afcbb1e9618e" path="/var/lib/kubelet/pods/88f30848-6b41-4978-9b03-afcbb1e9618e/volumes" Oct 10 
09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.686779 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.695935 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.730216 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.731426 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.733206 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.742361 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.854720 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3643c9df-0a5f-4b91-a164-a61498f3725c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.854787 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6pv\" (UniqueName: \"kubernetes.io/projected/3643c9df-0a5f-4b91-a164-a61498f3725c-kube-api-access-nq6pv\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.854828 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3643c9df-0a5f-4b91-a164-a61498f3725c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.956731 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6pv\" (UniqueName: \"kubernetes.io/projected/3643c9df-0a5f-4b91-a164-a61498f3725c-kube-api-access-nq6pv\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.956793 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3643c9df-0a5f-4b91-a164-a61498f3725c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.956914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3643c9df-0a5f-4b91-a164-a61498f3725c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.964289 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3643c9df-0a5f-4b91-a164-a61498f3725c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.965175 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3643c9df-0a5f-4b91-a164-a61498f3725c-config-data\") pod \"nova-cell0-conductor-0\" (UID: 
\"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:41 crc kubenswrapper[4732]: I1010 09:18:41.973972 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6pv\" (UniqueName: \"kubernetes.io/projected/3643c9df-0a5f-4b91-a164-a61498f3725c-kube-api-access-nq6pv\") pod \"nova-cell0-conductor-0\" (UID: \"3643c9df-0a5f-4b91-a164-a61498f3725c\") " pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.139603 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.605280 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 10 09:18:42 crc kubenswrapper[4732]: W1010 09:18:42.608889 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3643c9df_0a5f_4b91_a164_a61498f3725c.slice/crio-8bcd0891a864e8aff8a36aff5c3245281f562d2cb781a327d6fe5bcaf5639358 WatchSource:0}: Error finding container 8bcd0891a864e8aff8a36aff5c3245281f562d2cb781a327d6fe5bcaf5639358: Status 404 returned error can't find the container with id 8bcd0891a864e8aff8a36aff5c3245281f562d2cb781a327d6fe5bcaf5639358 Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.632535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"84dd5777-e881-497c-845b-97a5d608989c","Type":"ContainerStarted","Data":"f74d7efe9635882b258b4b73fa64377aa669e90120a77f0c9cf202b4cd0e8337"} Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.639936 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff6afe7c-aa67-4af3-9de1-f1046d7ea386","Type":"ContainerStarted","Data":"800f51db83ee99918b43035562caec755bc6d7cd82c0b14fd2b6d6c2f388ac65"} Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 
09:18:42.639985 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff6afe7c-aa67-4af3-9de1-f1046d7ea386","Type":"ContainerStarted","Data":"6961950791ecd60c9b2e74f65ff723f551a51ede5a8084249e9809d91d3cd541"} Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.640001 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ff6afe7c-aa67-4af3-9de1-f1046d7ea386","Type":"ContainerStarted","Data":"13a5bdde1c4132e6f3bdc753a6d584b77cff8639504ccea4547c46ede5c6e060"} Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.643469 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3643c9df-0a5f-4b91-a164-a61498f3725c","Type":"ContainerStarted","Data":"8bcd0891a864e8aff8a36aff5c3245281f562d2cb781a327d6fe5bcaf5639358"} Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.665181 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.665161831 podStartE2EDuration="3.665161831s" podCreationTimestamp="2025-10-10 09:18:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:18:42.656510345 +0000 UTC m=+8849.726101596" watchObservedRunningTime="2025-10-10 09:18:42.665161831 +0000 UTC m=+8849.734753072" Oct 10 09:18:42 crc kubenswrapper[4732]: I1010 09:18:42.699445 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.699424826 podStartE2EDuration="2.699424826s" podCreationTimestamp="2025-10-10 09:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:18:42.676455609 +0000 UTC m=+8849.746046880" watchObservedRunningTime="2025-10-10 09:18:42.699424826 +0000 UTC m=+8849.769016067" Oct 10 09:18:43 crc 
kubenswrapper[4732]: I1010 09:18:43.684814 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52200daf-815f-40c7-8359-a7140dcd863f" path="/var/lib/kubelet/pods/52200daf-815f-40c7-8359-a7140dcd863f/volumes" Oct 10 09:18:43 crc kubenswrapper[4732]: I1010 09:18:43.686349 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3643c9df-0a5f-4b91-a164-a61498f3725c","Type":"ContainerStarted","Data":"e9ca2120e9adaf76e37e5b1fbccf9aa776bf76ef89b8025c74e7cf7bd1055ca5"} Oct 10 09:18:43 crc kubenswrapper[4732]: I1010 09:18:43.739781 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.739750918 podStartE2EDuration="2.739750918s" podCreationTimestamp="2025-10-10 09:18:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:18:43.720184624 +0000 UTC m=+8850.789775895" watchObservedRunningTime="2025-10-10 09:18:43.739750918 +0000 UTC m=+8850.809342199" Oct 10 09:18:44 crc kubenswrapper[4732]: I1010 09:18:44.673943 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:45 crc kubenswrapper[4732]: I1010 09:18:45.002364 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 10 09:18:46 crc kubenswrapper[4732]: I1010 09:18:46.082820 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 09:18:46 crc kubenswrapper[4732]: I1010 09:18:46.083258 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 10 09:18:47 crc kubenswrapper[4732]: I1010 09:18:47.179868 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 10 09:18:47 crc kubenswrapper[4732]: I1010 
09:18:47.968123 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 10 09:18:50 crc kubenswrapper[4732]: I1010 09:18:50.002152 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 10 09:18:50 crc kubenswrapper[4732]: I1010 09:18:50.044800 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 10 09:18:50 crc kubenswrapper[4732]: I1010 09:18:50.603167 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 09:18:50 crc kubenswrapper[4732]: I1010 09:18:50.603231 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 10 09:18:50 crc kubenswrapper[4732]: I1010 09:18:50.768433 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 10 09:18:51 crc kubenswrapper[4732]: I1010 09:18:51.082974 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 09:18:51 crc kubenswrapper[4732]: I1010 09:18:51.083320 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 10 09:18:51 crc kubenswrapper[4732]: I1010 09:18:51.616831 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84dd5777-e881-497c-845b-97a5d608989c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 09:18:51 crc kubenswrapper[4732]: I1010 09:18:51.616831 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="84dd5777-e881-497c-845b-97a5d608989c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.186:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Oct 10 09:18:52 crc kubenswrapper[4732]: I1010 09:18:52.098842 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff6afe7c-aa67-4af3-9de1-f1046d7ea386" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.187:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 10 09:18:52 crc kubenswrapper[4732]: I1010 09:18:52.098865 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ff6afe7c-aa67-4af3-9de1-f1046d7ea386" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.187:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 10 09:18:55 crc kubenswrapper[4732]: I1010 09:18:55.356796 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:18:55 crc kubenswrapper[4732]: I1010 09:18:55.357288 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:19:00 crc kubenswrapper[4732]: I1010 09:19:00.612803 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 10 09:19:00 crc kubenswrapper[4732]: I1010 09:19:00.614669 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 09:19:00 crc kubenswrapper[4732]: I1010 09:19:00.619106 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-api-0" Oct 10 09:19:00 crc kubenswrapper[4732]: I1010 09:19:00.623580 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 09:19:00 crc kubenswrapper[4732]: I1010 09:19:00.828395 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 10 09:19:00 crc kubenswrapper[4732]: I1010 09:19:00.835674 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 10 09:19:01 crc kubenswrapper[4732]: I1010 09:19:01.087960 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 09:19:01 crc kubenswrapper[4732]: I1010 09:19:01.088805 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 10 09:19:01 crc kubenswrapper[4732]: I1010 09:19:01.093971 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 09:19:01 crc kubenswrapper[4732]: I1010 09:19:01.840296 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.810870 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn"] Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.812158 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.814728 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.815446 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.815520 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.815457 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-fdz9b" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.816047 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.816084 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.816048 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.831672 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn"] Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.920895 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.920976 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.921006 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.921037 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhj5\" (UniqueName: \"kubernetes.io/projected/b0c399d3-48d7-4316-931f-2115e341ce3d-kube-api-access-8rhj5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.921062 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.921164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.921267 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.921298 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:02 crc kubenswrapper[4732]: I1010 09:19:02.921408 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.023967 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.024339 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.024464 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.024564 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.024630 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.024652 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.025808 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.025901 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhj5\" (UniqueName: \"kubernetes.io/projected/b0c399d3-48d7-4316-931f-2115e341ce3d-kube-api-access-8rhj5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.026295 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.026393 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.043270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.043276 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.043404 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-ssh-key\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.045185 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.045352 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.054286 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.054352 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") 
" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.058675 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhj5\" (UniqueName: \"kubernetes.io/projected/b0c399d3-48d7-4316-931f-2115e341ce3d-kube-api-access-8rhj5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.135455 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.719593 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn"] Oct 10 09:19:03 crc kubenswrapper[4732]: I1010 09:19:03.856963 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" event={"ID":"b0c399d3-48d7-4316-931f-2115e341ce3d","Type":"ContainerStarted","Data":"6cea1e38e8c2e405d835dd3611e995c5a168f49c05ccf41d2c315abd4ec8eab2"} Oct 10 09:19:04 crc kubenswrapper[4732]: I1010 09:19:04.867535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" event={"ID":"b0c399d3-48d7-4316-931f-2115e341ce3d","Type":"ContainerStarted","Data":"3c8941daa6e33058484f13eaf50cd0b410079b993357cd93d11c4a3fd5b4a6c7"} Oct 10 09:19:04 crc kubenswrapper[4732]: I1010 09:19:04.885448 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" podStartSLOduration=2.232024193 podStartE2EDuration="2.88542282s" podCreationTimestamp="2025-10-10 09:19:02 +0000 UTC" 
firstStartedPulling="2025-10-10 09:19:03.71290161 +0000 UTC m=+8870.782492871" lastFinishedPulling="2025-10-10 09:19:04.366300257 +0000 UTC m=+8871.435891498" observedRunningTime="2025-10-10 09:19:04.883804356 +0000 UTC m=+8871.953395627" watchObservedRunningTime="2025-10-10 09:19:04.88542282 +0000 UTC m=+8871.955014061" Oct 10 09:19:25 crc kubenswrapper[4732]: I1010 09:19:25.356574 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:19:25 crc kubenswrapper[4732]: I1010 09:19:25.357113 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:19:25 crc kubenswrapper[4732]: I1010 09:19:25.357150 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 09:19:25 crc kubenswrapper[4732]: I1010 09:19:25.357832 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eaff4c9179560058a51d4453defd76684d517cb107f480f0e8aec772566d759b"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:19:25 crc kubenswrapper[4732]: I1010 09:19:25.357881 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" 
containerName="machine-config-daemon" containerID="cri-o://eaff4c9179560058a51d4453defd76684d517cb107f480f0e8aec772566d759b" gracePeriod=600 Oct 10 09:19:26 crc kubenswrapper[4732]: I1010 09:19:26.135183 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="eaff4c9179560058a51d4453defd76684d517cb107f480f0e8aec772566d759b" exitCode=0 Oct 10 09:19:26 crc kubenswrapper[4732]: I1010 09:19:26.135347 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"eaff4c9179560058a51d4453defd76684d517cb107f480f0e8aec772566d759b"} Oct 10 09:19:26 crc kubenswrapper[4732]: I1010 09:19:26.135925 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d"} Oct 10 09:19:26 crc kubenswrapper[4732]: I1010 09:19:26.135973 4732 scope.go:117] "RemoveContainer" containerID="10bafe73b1e49f47fc2595b7b44c86e6f7431a5f35fafb7fbc2ad2cf6325c675" Oct 10 09:19:38 crc kubenswrapper[4732]: I1010 09:19:38.819901 4732 scope.go:117] "RemoveContainer" containerID="a48747372aa6f781c05187465d1498190ad8ecb8a1bc0ce35ede767afbc3a5dd" Oct 10 09:19:38 crc kubenswrapper[4732]: I1010 09:19:38.844615 4732 scope.go:117] "RemoveContainer" containerID="fb4f7baff8f2803876f74e2918bc3d30d740ef069aabbb540f9291ef255d5fe1" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.053762 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mhnhk"] Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.056885 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.065119 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhnhk"] Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.214963 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgfsd\" (UniqueName: \"kubernetes.io/projected/4911ee6f-f3d7-4589-bc20-f32c5f875595-kube-api-access-jgfsd\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.215022 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-utilities\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.215268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-catalog-content\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.317640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-utilities\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.317742 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-catalog-content\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.317872 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgfsd\" (UniqueName: \"kubernetes.io/projected/4911ee6f-f3d7-4589-bc20-f32c5f875595-kube-api-access-jgfsd\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.318214 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-utilities\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.318518 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-catalog-content\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.337422 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgfsd\" (UniqueName: \"kubernetes.io/projected/4911ee6f-f3d7-4589-bc20-f32c5f875595-kube-api-access-jgfsd\") pod \"redhat-operators-mhnhk\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.383040 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:44 crc kubenswrapper[4732]: I1010 09:19:44.649118 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mhnhk"] Oct 10 09:19:45 crc kubenswrapper[4732]: I1010 09:19:45.365093 4732 generic.go:334] "Generic (PLEG): container finished" podID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerID="be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f" exitCode=0 Oct 10 09:19:45 crc kubenswrapper[4732]: I1010 09:19:45.365211 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhnhk" event={"ID":"4911ee6f-f3d7-4589-bc20-f32c5f875595","Type":"ContainerDied","Data":"be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f"} Oct 10 09:19:45 crc kubenswrapper[4732]: I1010 09:19:45.365444 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhnhk" event={"ID":"4911ee6f-f3d7-4589-bc20-f32c5f875595","Type":"ContainerStarted","Data":"a42f9030f66f3ce2b353c79f7c5a90ece2e694c1c319af2055a5176640b3d739"} Oct 10 09:19:46 crc kubenswrapper[4732]: I1010 09:19:46.376734 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhnhk" event={"ID":"4911ee6f-f3d7-4589-bc20-f32c5f875595","Type":"ContainerStarted","Data":"ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f"} Oct 10 09:19:48 crc kubenswrapper[4732]: I1010 09:19:48.395856 4732 generic.go:334] "Generic (PLEG): container finished" podID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerID="ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f" exitCode=0 Oct 10 09:19:48 crc kubenswrapper[4732]: I1010 09:19:48.396159 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhnhk" 
event={"ID":"4911ee6f-f3d7-4589-bc20-f32c5f875595","Type":"ContainerDied","Data":"ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f"} Oct 10 09:19:49 crc kubenswrapper[4732]: I1010 09:19:49.406935 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhnhk" event={"ID":"4911ee6f-f3d7-4589-bc20-f32c5f875595","Type":"ContainerStarted","Data":"ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208"} Oct 10 09:19:49 crc kubenswrapper[4732]: I1010 09:19:49.427557 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mhnhk" podStartSLOduration=1.9275663619999999 podStartE2EDuration="5.4275402s" podCreationTimestamp="2025-10-10 09:19:44 +0000 UTC" firstStartedPulling="2025-10-10 09:19:45.367914643 +0000 UTC m=+8912.437505874" lastFinishedPulling="2025-10-10 09:19:48.867888471 +0000 UTC m=+8915.937479712" observedRunningTime="2025-10-10 09:19:49.423377386 +0000 UTC m=+8916.492968647" watchObservedRunningTime="2025-10-10 09:19:49.4275402 +0000 UTC m=+8916.497131441" Oct 10 09:19:54 crc kubenswrapper[4732]: I1010 09:19:54.385848 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:54 crc kubenswrapper[4732]: I1010 09:19:54.386412 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:54 crc kubenswrapper[4732]: I1010 09:19:54.432060 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:54 crc kubenswrapper[4732]: I1010 09:19:54.498849 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:54 crc kubenswrapper[4732]: I1010 09:19:54.667615 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-mhnhk"] Oct 10 09:19:56 crc kubenswrapper[4732]: I1010 09:19:56.476007 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mhnhk" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerName="registry-server" containerID="cri-o://ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208" gracePeriod=2 Oct 10 09:19:56 crc kubenswrapper[4732]: I1010 09:19:56.975933 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.091977 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-utilities\") pod \"4911ee6f-f3d7-4589-bc20-f32c5f875595\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.092154 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgfsd\" (UniqueName: \"kubernetes.io/projected/4911ee6f-f3d7-4589-bc20-f32c5f875595-kube-api-access-jgfsd\") pod \"4911ee6f-f3d7-4589-bc20-f32c5f875595\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.092310 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-catalog-content\") pod \"4911ee6f-f3d7-4589-bc20-f32c5f875595\" (UID: \"4911ee6f-f3d7-4589-bc20-f32c5f875595\") " Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.093966 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-utilities" (OuterVolumeSpecName: "utilities") pod "4911ee6f-f3d7-4589-bc20-f32c5f875595" (UID: 
"4911ee6f-f3d7-4589-bc20-f32c5f875595"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.095195 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.099049 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4911ee6f-f3d7-4589-bc20-f32c5f875595-kube-api-access-jgfsd" (OuterVolumeSpecName: "kube-api-access-jgfsd") pod "4911ee6f-f3d7-4589-bc20-f32c5f875595" (UID: "4911ee6f-f3d7-4589-bc20-f32c5f875595"). InnerVolumeSpecName "kube-api-access-jgfsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.191552 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4911ee6f-f3d7-4589-bc20-f32c5f875595" (UID: "4911ee6f-f3d7-4589-bc20-f32c5f875595"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.197313 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgfsd\" (UniqueName: \"kubernetes.io/projected/4911ee6f-f3d7-4589-bc20-f32c5f875595-kube-api-access-jgfsd\") on node \"crc\" DevicePath \"\"" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.197380 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4911ee6f-f3d7-4589-bc20-f32c5f875595-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.488344 4732 generic.go:334] "Generic (PLEG): container finished" podID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerID="ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208" exitCode=0 Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.488386 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhnhk" event={"ID":"4911ee6f-f3d7-4589-bc20-f32c5f875595","Type":"ContainerDied","Data":"ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208"} Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.488420 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mhnhk" event={"ID":"4911ee6f-f3d7-4589-bc20-f32c5f875595","Type":"ContainerDied","Data":"a42f9030f66f3ce2b353c79f7c5a90ece2e694c1c319af2055a5176640b3d739"} Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.488466 4732 scope.go:117] "RemoveContainer" containerID="ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.488460 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mhnhk" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.524903 4732 scope.go:117] "RemoveContainer" containerID="ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.540943 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mhnhk"] Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.550164 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mhnhk"] Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.560191 4732 scope.go:117] "RemoveContainer" containerID="be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.590553 4732 scope.go:117] "RemoveContainer" containerID="ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208" Oct 10 09:19:57 crc kubenswrapper[4732]: E1010 09:19:57.590982 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208\": container with ID starting with ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208 not found: ID does not exist" containerID="ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.591026 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208"} err="failed to get container status \"ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208\": rpc error: code = NotFound desc = could not find container \"ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208\": container with ID starting with ab3d0694d3021b8c8f172c5c52f45e7fcbfba3226919815ee9c860d2d807e208 not found: ID does 
not exist" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.591092 4732 scope.go:117] "RemoveContainer" containerID="ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f" Oct 10 09:19:57 crc kubenswrapper[4732]: E1010 09:19:57.591516 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f\": container with ID starting with ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f not found: ID does not exist" containerID="ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.591545 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f"} err="failed to get container status \"ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f\": rpc error: code = NotFound desc = could not find container \"ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f\": container with ID starting with ce969ad0a075a64da6bc0ae7fc320189ed82c8a698ffe907fe8d046f3de0585f not found: ID does not exist" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.591564 4732 scope.go:117] "RemoveContainer" containerID="be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f" Oct 10 09:19:57 crc kubenswrapper[4732]: E1010 09:19:57.591906 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f\": container with ID starting with be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f not found: ID does not exist" containerID="be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.591935 4732 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f"} err="failed to get container status \"be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f\": rpc error: code = NotFound desc = could not find container \"be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f\": container with ID starting with be4cb6197370f1449190d5d200e2f896a74b77f59bfd1bbde30dfaeab4b6b07f not found: ID does not exist" Oct 10 09:19:57 crc kubenswrapper[4732]: I1010 09:19:57.670279 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" path="/var/lib/kubelet/pods/4911ee6f-f3d7-4589-bc20-f32c5f875595/volumes" Oct 10 09:21:25 crc kubenswrapper[4732]: I1010 09:21:25.359922 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:21:25 crc kubenswrapper[4732]: I1010 09:21:25.362987 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:21:55 crc kubenswrapper[4732]: I1010 09:21:55.356380 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:21:55 crc kubenswrapper[4732]: I1010 09:21:55.357072 4732 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:22:25 crc kubenswrapper[4732]: I1010 09:22:25.356555 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:22:25 crc kubenswrapper[4732]: I1010 09:22:25.357237 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:22:25 crc kubenswrapper[4732]: I1010 09:22:25.357291 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 09:22:25 crc kubenswrapper[4732]: I1010 09:22:25.358183 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:22:25 crc kubenswrapper[4732]: I1010 09:22:25.358262 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" 
containerID="cri-o://d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" gracePeriod=600 Oct 10 09:22:25 crc kubenswrapper[4732]: E1010 09:22:25.480593 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:22:26 crc kubenswrapper[4732]: I1010 09:22:26.057802 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" exitCode=0 Oct 10 09:22:26 crc kubenswrapper[4732]: I1010 09:22:26.057975 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d"} Oct 10 09:22:26 crc kubenswrapper[4732]: I1010 09:22:26.058123 4732 scope.go:117] "RemoveContainer" containerID="eaff4c9179560058a51d4453defd76684d517cb107f480f0e8aec772566d759b" Oct 10 09:22:26 crc kubenswrapper[4732]: I1010 09:22:26.058833 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:22:26 crc kubenswrapper[4732]: E1010 09:22:26.059114 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:22:37 crc kubenswrapper[4732]: I1010 09:22:37.660393 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:22:37 crc kubenswrapper[4732]: E1010 09:22:37.661651 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:22:49 crc kubenswrapper[4732]: I1010 09:22:49.660110 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:22:49 crc kubenswrapper[4732]: E1010 09:22:49.661087 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:23:00 crc kubenswrapper[4732]: I1010 09:23:00.661572 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:23:00 crc kubenswrapper[4732]: E1010 09:23:00.662734 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:23:14 crc kubenswrapper[4732]: I1010 09:23:14.660104 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:23:14 crc kubenswrapper[4732]: E1010 09:23:14.660880 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:23:26 crc kubenswrapper[4732]: I1010 09:23:26.661165 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:23:26 crc kubenswrapper[4732]: E1010 09:23:26.662308 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:23:38 crc kubenswrapper[4732]: I1010 09:23:38.659954 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:23:38 crc kubenswrapper[4732]: E1010 09:23:38.660731 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:23:52 crc kubenswrapper[4732]: I1010 09:23:52.660954 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:23:52 crc kubenswrapper[4732]: E1010 09:23:52.662103 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:24:07 crc kubenswrapper[4732]: I1010 09:24:07.660964 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:24:07 crc kubenswrapper[4732]: E1010 09:24:07.662003 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:24:20 crc kubenswrapper[4732]: I1010 09:24:20.661496 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:24:20 crc kubenswrapper[4732]: E1010 09:24:20.662671 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:24:31 crc kubenswrapper[4732]: I1010 09:24:31.661310 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:24:31 crc kubenswrapper[4732]: E1010 09:24:31.662548 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:24:42 crc kubenswrapper[4732]: I1010 09:24:42.660371 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:24:42 crc kubenswrapper[4732]: E1010 09:24:42.661264 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:24:57 crc kubenswrapper[4732]: I1010 09:24:57.660934 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:24:57 crc kubenswrapper[4732]: E1010 09:24:57.661688 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:25:12 crc kubenswrapper[4732]: I1010 09:25:12.660432 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:25:12 crc kubenswrapper[4732]: E1010 09:25:12.661336 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:25:25 crc kubenswrapper[4732]: I1010 09:25:25.660640 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:25:25 crc kubenswrapper[4732]: E1010 09:25:25.661400 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:25:38 crc kubenswrapper[4732]: I1010 09:25:38.660309 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:25:38 crc kubenswrapper[4732]: E1010 09:25:38.661152 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:25:50 crc kubenswrapper[4732]: I1010 09:25:50.661607 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:25:50 crc kubenswrapper[4732]: E1010 09:25:50.663125 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:26:03 crc kubenswrapper[4732]: I1010 09:26:03.677615 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:26:03 crc kubenswrapper[4732]: E1010 09:26:03.679391 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.331850 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4vqls"] Oct 10 09:26:07 crc kubenswrapper[4732]: E1010 09:26:07.334255 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" 
containerName="extract-utilities" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.334274 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerName="extract-utilities" Oct 10 09:26:07 crc kubenswrapper[4732]: E1010 09:26:07.334303 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerName="registry-server" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.334312 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerName="registry-server" Oct 10 09:26:07 crc kubenswrapper[4732]: E1010 09:26:07.334356 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerName="extract-content" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.334365 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerName="extract-content" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.334626 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4911ee6f-f3d7-4589-bc20-f32c5f875595" containerName="registry-server" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.336994 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.359512 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vqls"] Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.438557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-utilities\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.438761 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6h8c\" (UniqueName: \"kubernetes.io/projected/94f63e3b-d792-4e4a-810a-eb13b9bc895d-kube-api-access-k6h8c\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.438809 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-catalog-content\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.540802 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-catalog-content\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.540880 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-utilities\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.541061 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6h8c\" (UniqueName: \"kubernetes.io/projected/94f63e3b-d792-4e4a-810a-eb13b9bc895d-kube-api-access-k6h8c\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.541351 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-utilities\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.541511 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-catalog-content\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.559770 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6h8c\" (UniqueName: \"kubernetes.io/projected/94f63e3b-d792-4e4a-810a-eb13b9bc895d-kube-api-access-k6h8c\") pod \"certified-operators-4vqls\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:07 crc kubenswrapper[4732]: I1010 09:26:07.668415 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:08 crc kubenswrapper[4732]: W1010 09:26:08.182925 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f63e3b_d792_4e4a_810a_eb13b9bc895d.slice/crio-48cbaac303d7c05da2eb3ba30eadec5aba154d70373ba070a914451e519e2181 WatchSource:0}: Error finding container 48cbaac303d7c05da2eb3ba30eadec5aba154d70373ba070a914451e519e2181: Status 404 returned error can't find the container with id 48cbaac303d7c05da2eb3ba30eadec5aba154d70373ba070a914451e519e2181 Oct 10 09:26:08 crc kubenswrapper[4732]: I1010 09:26:08.183157 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4vqls"] Oct 10 09:26:08 crc kubenswrapper[4732]: I1010 09:26:08.454982 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerStarted","Data":"0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa"} Oct 10 09:26:08 crc kubenswrapper[4732]: I1010 09:26:08.455319 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerStarted","Data":"48cbaac303d7c05da2eb3ba30eadec5aba154d70373ba070a914451e519e2181"} Oct 10 09:26:08 crc kubenswrapper[4732]: I1010 09:26:08.458192 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:26:09 crc kubenswrapper[4732]: I1010 09:26:09.466371 4732 generic.go:334] "Generic (PLEG): container finished" podID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerID="0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa" exitCode=0 Oct 10 09:26:09 crc kubenswrapper[4732]: I1010 09:26:09.466464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerDied","Data":"0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa"} Oct 10 09:26:09 crc kubenswrapper[4732]: I1010 09:26:09.466798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerStarted","Data":"eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1"} Oct 10 09:26:10 crc kubenswrapper[4732]: I1010 09:26:10.479465 4732 generic.go:334] "Generic (PLEG): container finished" podID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerID="eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1" exitCode=0 Oct 10 09:26:10 crc kubenswrapper[4732]: I1010 09:26:10.479596 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerDied","Data":"eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1"} Oct 10 09:26:11 crc kubenswrapper[4732]: I1010 09:26:11.492046 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerStarted","Data":"d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563"} Oct 10 09:26:11 crc kubenswrapper[4732]: I1010 09:26:11.510752 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4vqls" podStartSLOduration=2.005239849 podStartE2EDuration="4.510737794s" podCreationTimestamp="2025-10-10 09:26:07 +0000 UTC" firstStartedPulling="2025-10-10 09:26:08.457554916 +0000 UTC m=+9295.527146197" lastFinishedPulling="2025-10-10 09:26:10.963052881 +0000 UTC m=+9298.032644142" observedRunningTime="2025-10-10 09:26:11.507913377 +0000 UTC m=+9298.577504628" 
watchObservedRunningTime="2025-10-10 09:26:11.510737794 +0000 UTC m=+9298.580329025" Oct 10 09:26:17 crc kubenswrapper[4732]: I1010 09:26:17.682363 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:17 crc kubenswrapper[4732]: I1010 09:26:17.683074 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:17 crc kubenswrapper[4732]: I1010 09:26:17.742043 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:18 crc kubenswrapper[4732]: I1010 09:26:18.619649 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:18 crc kubenswrapper[4732]: I1010 09:26:18.667767 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:26:18 crc kubenswrapper[4732]: E1010 09:26:18.668266 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:26:18 crc kubenswrapper[4732]: I1010 09:26:18.683414 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vqls"] Oct 10 09:26:20 crc kubenswrapper[4732]: I1010 09:26:20.591323 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4vqls" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="registry-server" 
containerID="cri-o://d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563" gracePeriod=2 Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.114182 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.233054 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-catalog-content\") pod \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.233314 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-utilities\") pod \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.233348 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6h8c\" (UniqueName: \"kubernetes.io/projected/94f63e3b-d792-4e4a-810a-eb13b9bc895d-kube-api-access-k6h8c\") pod \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\" (UID: \"94f63e3b-d792-4e4a-810a-eb13b9bc895d\") " Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.234716 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-utilities" (OuterVolumeSpecName: "utilities") pod "94f63e3b-d792-4e4a-810a-eb13b9bc895d" (UID: "94f63e3b-d792-4e4a-810a-eb13b9bc895d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.244412 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f63e3b-d792-4e4a-810a-eb13b9bc895d-kube-api-access-k6h8c" (OuterVolumeSpecName: "kube-api-access-k6h8c") pod "94f63e3b-d792-4e4a-810a-eb13b9bc895d" (UID: "94f63e3b-d792-4e4a-810a-eb13b9bc895d"). InnerVolumeSpecName "kube-api-access-k6h8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.292548 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94f63e3b-d792-4e4a-810a-eb13b9bc895d" (UID: "94f63e3b-d792-4e4a-810a-eb13b9bc895d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.336214 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.336498 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94f63e3b-d792-4e4a-810a-eb13b9bc895d-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.336619 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6h8c\" (UniqueName: \"kubernetes.io/projected/94f63e3b-d792-4e4a-810a-eb13b9bc895d-kube-api-access-k6h8c\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.604905 4732 generic.go:334] "Generic (PLEG): container finished" podID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" 
containerID="d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563" exitCode=0 Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.604954 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerDied","Data":"d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563"} Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.604982 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4vqls" event={"ID":"94f63e3b-d792-4e4a-810a-eb13b9bc895d","Type":"ContainerDied","Data":"48cbaac303d7c05da2eb3ba30eadec5aba154d70373ba070a914451e519e2181"} Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.605003 4732 scope.go:117] "RemoveContainer" containerID="d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.606197 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4vqls" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.636856 4732 scope.go:117] "RemoveContainer" containerID="eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.648215 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4vqls"] Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.677428 4732 scope.go:117] "RemoveContainer" containerID="0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.677899 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4vqls"] Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.707246 4732 scope.go:117] "RemoveContainer" containerID="d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563" Oct 10 09:26:21 crc kubenswrapper[4732]: E1010 09:26:21.708228 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563\": container with ID starting with d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563 not found: ID does not exist" containerID="d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.708285 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563"} err="failed to get container status \"d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563\": rpc error: code = NotFound desc = could not find container \"d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563\": container with ID starting with d4f1eba1fe6b6db603e76bf967fc36aedc1c013de955aba6e577da45ef6c0563 not 
found: ID does not exist" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.708314 4732 scope.go:117] "RemoveContainer" containerID="eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1" Oct 10 09:26:21 crc kubenswrapper[4732]: E1010 09:26:21.708684 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1\": container with ID starting with eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1 not found: ID does not exist" containerID="eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.708802 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1"} err="failed to get container status \"eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1\": rpc error: code = NotFound desc = could not find container \"eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1\": container with ID starting with eb4c63924f6dee622de58df6a887ef39b1bb21a0e3f5bbb6c45300c9419e25a1 not found: ID does not exist" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.708865 4732 scope.go:117] "RemoveContainer" containerID="0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa" Oct 10 09:26:21 crc kubenswrapper[4732]: E1010 09:26:21.709349 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa\": container with ID starting with 0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa not found: ID does not exist" containerID="0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa" Oct 10 09:26:21 crc kubenswrapper[4732]: I1010 09:26:21.709382 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa"} err="failed to get container status \"0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa\": rpc error: code = NotFound desc = could not find container \"0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa\": container with ID starting with 0795c13b5e7dca294bf63d323f993bed597f7de2f37fa97c24cef5ae2d676bfa not found: ID does not exist" Oct 10 09:26:21 crc kubenswrapper[4732]: E1010 09:26:21.728849 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f63e3b_d792_4e4a_810a_eb13b9bc895d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f63e3b_d792_4e4a_810a_eb13b9bc895d.slice/crio-48cbaac303d7c05da2eb3ba30eadec5aba154d70373ba070a914451e519e2181\": RecentStats: unable to find data in memory cache]" Oct 10 09:26:23 crc kubenswrapper[4732]: I1010 09:26:23.675673 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" path="/var/lib/kubelet/pods/94f63e3b-d792-4e4a-810a-eb13b9bc895d/volumes" Oct 10 09:26:29 crc kubenswrapper[4732]: I1010 09:26:29.661393 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:26:29 crc kubenswrapper[4732]: E1010 09:26:29.663637 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.342770 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-flsjt"] Oct 10 09:26:39 crc kubenswrapper[4732]: E1010 09:26:39.344256 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="extract-utilities" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.344284 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="extract-utilities" Oct 10 09:26:39 crc kubenswrapper[4732]: E1010 09:26:39.344325 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="registry-server" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.344333 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="registry-server" Oct 10 09:26:39 crc kubenswrapper[4732]: E1010 09:26:39.344361 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="extract-content" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.344367 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="extract-content" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.344616 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f63e3b-d792-4e4a-810a-eb13b9bc895d" containerName="registry-server" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.346255 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.361149 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flsjt"] Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.429467 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v9jq\" (UniqueName: \"kubernetes.io/projected/49765687-46a8-4ac7-b4a7-e0003a129e89-kube-api-access-8v9jq\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.430106 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-catalog-content\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.430240 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-utilities\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.532633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v9jq\" (UniqueName: \"kubernetes.io/projected/49765687-46a8-4ac7-b4a7-e0003a129e89-kube-api-access-8v9jq\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.533147 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-catalog-content\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.533305 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-utilities\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.533631 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-catalog-content\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.533870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-utilities\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.567887 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v9jq\" (UniqueName: \"kubernetes.io/projected/49765687-46a8-4ac7-b4a7-e0003a129e89-kube-api-access-8v9jq\") pod \"community-operators-flsjt\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:39 crc kubenswrapper[4732]: I1010 09:26:39.695352 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:40 crc kubenswrapper[4732]: I1010 09:26:40.183823 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-flsjt"] Oct 10 09:26:40 crc kubenswrapper[4732]: I1010 09:26:40.788480 4732 generic.go:334] "Generic (PLEG): container finished" podID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerID="7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc" exitCode=0 Oct 10 09:26:40 crc kubenswrapper[4732]: I1010 09:26:40.788819 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flsjt" event={"ID":"49765687-46a8-4ac7-b4a7-e0003a129e89","Type":"ContainerDied","Data":"7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc"} Oct 10 09:26:40 crc kubenswrapper[4732]: I1010 09:26:40.788849 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flsjt" event={"ID":"49765687-46a8-4ac7-b4a7-e0003a129e89","Type":"ContainerStarted","Data":"58cbb186e077028c8b47ae3a12d9fa22e07139aa58dccc55ccd0770455b5abc8"} Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.660820 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:26:41 crc kubenswrapper[4732]: E1010 09:26:41.661302 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.763332 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gr7xz"] Oct 
10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.773127 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.818801 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr7xz"] Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.887123 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5w6\" (UniqueName: \"kubernetes.io/projected/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-kube-api-access-ts5w6\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.887185 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-catalog-content\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.887465 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-utilities\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.989941 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5w6\" (UniqueName: \"kubernetes.io/projected/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-kube-api-access-ts5w6\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " 
pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.990032 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-catalog-content\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.990154 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-utilities\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.990743 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-utilities\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:41 crc kubenswrapper[4732]: I1010 09:26:41.990742 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-catalog-content\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:42 crc kubenswrapper[4732]: I1010 09:26:42.012620 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5w6\" (UniqueName: \"kubernetes.io/projected/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-kube-api-access-ts5w6\") pod \"redhat-marketplace-gr7xz\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 
09:26:42 crc kubenswrapper[4732]: I1010 09:26:42.164248 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:42 crc kubenswrapper[4732]: I1010 09:26:42.676882 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr7xz"] Oct 10 09:26:42 crc kubenswrapper[4732]: I1010 09:26:42.845119 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr7xz" event={"ID":"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8","Type":"ContainerStarted","Data":"eceedb846461ccd414508c468ed6ae85c92f02cb90d0d58cff3662ed79efb717"} Oct 10 09:26:42 crc kubenswrapper[4732]: I1010 09:26:42.847925 4732 generic.go:334] "Generic (PLEG): container finished" podID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerID="43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396" exitCode=0 Oct 10 09:26:42 crc kubenswrapper[4732]: I1010 09:26:42.847974 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flsjt" event={"ID":"49765687-46a8-4ac7-b4a7-e0003a129e89","Type":"ContainerDied","Data":"43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396"} Oct 10 09:26:43 crc kubenswrapper[4732]: I1010 09:26:43.858615 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerID="74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c" exitCode=0 Oct 10 09:26:43 crc kubenswrapper[4732]: I1010 09:26:43.858840 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr7xz" event={"ID":"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8","Type":"ContainerDied","Data":"74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c"} Oct 10 09:26:43 crc kubenswrapper[4732]: I1010 09:26:43.862113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-flsjt" event={"ID":"49765687-46a8-4ac7-b4a7-e0003a129e89","Type":"ContainerStarted","Data":"76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5"} Oct 10 09:26:43 crc kubenswrapper[4732]: I1010 09:26:43.905587 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-flsjt" podStartSLOduration=2.302769408 podStartE2EDuration="4.905563098s" podCreationTimestamp="2025-10-10 09:26:39 +0000 UTC" firstStartedPulling="2025-10-10 09:26:40.790500582 +0000 UTC m=+9327.860091843" lastFinishedPulling="2025-10-10 09:26:43.393294292 +0000 UTC m=+9330.462885533" observedRunningTime="2025-10-10 09:26:43.898380352 +0000 UTC m=+9330.967971653" watchObservedRunningTime="2025-10-10 09:26:43.905563098 +0000 UTC m=+9330.975154379" Oct 10 09:26:45 crc kubenswrapper[4732]: I1010 09:26:45.909557 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerID="9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd" exitCode=0 Oct 10 09:26:45 crc kubenswrapper[4732]: I1010 09:26:45.909645 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr7xz" event={"ID":"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8","Type":"ContainerDied","Data":"9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd"} Oct 10 09:26:47 crc kubenswrapper[4732]: I1010 09:26:47.934591 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr7xz" event={"ID":"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8","Type":"ContainerStarted","Data":"5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996"} Oct 10 09:26:47 crc kubenswrapper[4732]: I1010 09:26:47.960093 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gr7xz" podStartSLOduration=4.341998107 podStartE2EDuration="6.960070164s" 
podCreationTimestamp="2025-10-10 09:26:41 +0000 UTC" firstStartedPulling="2025-10-10 09:26:43.860824617 +0000 UTC m=+9330.930415858" lastFinishedPulling="2025-10-10 09:26:46.478896674 +0000 UTC m=+9333.548487915" observedRunningTime="2025-10-10 09:26:47.953363891 +0000 UTC m=+9335.022955142" watchObservedRunningTime="2025-10-10 09:26:47.960070164 +0000 UTC m=+9335.029661405" Oct 10 09:26:49 crc kubenswrapper[4732]: I1010 09:26:49.695839 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:49 crc kubenswrapper[4732]: I1010 09:26:49.696098 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:49 crc kubenswrapper[4732]: I1010 09:26:49.745165 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:50 crc kubenswrapper[4732]: I1010 09:26:50.019715 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:50 crc kubenswrapper[4732]: I1010 09:26:50.929439 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flsjt"] Oct 10 09:26:51 crc kubenswrapper[4732]: I1010 09:26:51.977978 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-flsjt" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="registry-server" containerID="cri-o://76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5" gracePeriod=2 Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.167636 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.167720 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.236124 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.503645 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.543943 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-utilities\") pod \"49765687-46a8-4ac7-b4a7-e0003a129e89\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.544079 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-catalog-content\") pod \"49765687-46a8-4ac7-b4a7-e0003a129e89\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.544128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v9jq\" (UniqueName: \"kubernetes.io/projected/49765687-46a8-4ac7-b4a7-e0003a129e89-kube-api-access-8v9jq\") pod \"49765687-46a8-4ac7-b4a7-e0003a129e89\" (UID: \"49765687-46a8-4ac7-b4a7-e0003a129e89\") " Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.548560 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-utilities" (OuterVolumeSpecName: "utilities") pod "49765687-46a8-4ac7-b4a7-e0003a129e89" (UID: "49765687-46a8-4ac7-b4a7-e0003a129e89"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.551241 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49765687-46a8-4ac7-b4a7-e0003a129e89-kube-api-access-8v9jq" (OuterVolumeSpecName: "kube-api-access-8v9jq") pod "49765687-46a8-4ac7-b4a7-e0003a129e89" (UID: "49765687-46a8-4ac7-b4a7-e0003a129e89"). InnerVolumeSpecName "kube-api-access-8v9jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.646735 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v9jq\" (UniqueName: \"kubernetes.io/projected/49765687-46a8-4ac7-b4a7-e0003a129e89-kube-api-access-8v9jq\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.646787 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.995630 4732 generic.go:334] "Generic (PLEG): container finished" podID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerID="76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5" exitCode=0 Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.995672 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-flsjt" Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.995728 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flsjt" event={"ID":"49765687-46a8-4ac7-b4a7-e0003a129e89","Type":"ContainerDied","Data":"76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5"} Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.995774 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-flsjt" event={"ID":"49765687-46a8-4ac7-b4a7-e0003a129e89","Type":"ContainerDied","Data":"58cbb186e077028c8b47ae3a12d9fa22e07139aa58dccc55ccd0770455b5abc8"} Oct 10 09:26:52 crc kubenswrapper[4732]: I1010 09:26:52.995794 4732 scope.go:117] "RemoveContainer" containerID="76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.021657 4732 scope.go:117] "RemoveContainer" containerID="43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.044473 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.053745 4732 scope.go:117] "RemoveContainer" containerID="7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.109880 4732 scope.go:117] "RemoveContainer" containerID="76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5" Oct 10 09:26:53 crc kubenswrapper[4732]: E1010 09:26:53.111132 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5\": container with ID starting with 76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5 not found: ID 
does not exist" containerID="76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.111176 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5"} err="failed to get container status \"76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5\": rpc error: code = NotFound desc = could not find container \"76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5\": container with ID starting with 76fe0530532ea9eff95655cb525281368e2dbc383483760a333b21d6f8decfc5 not found: ID does not exist" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.111211 4732 scope.go:117] "RemoveContainer" containerID="43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396" Oct 10 09:26:53 crc kubenswrapper[4732]: E1010 09:26:53.111480 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396\": container with ID starting with 43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396 not found: ID does not exist" containerID="43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.111508 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396"} err="failed to get container status \"43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396\": rpc error: code = NotFound desc = could not find container \"43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396\": container with ID starting with 43d5d03f9c8ceb0fb960ecc784ac3fb3fd2a5ebc1cecea121675b37005f55396 not found: ID does not exist" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.111528 4732 
scope.go:117] "RemoveContainer" containerID="7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc" Oct 10 09:26:53 crc kubenswrapper[4732]: E1010 09:26:53.111872 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc\": container with ID starting with 7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc not found: ID does not exist" containerID="7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.112066 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc"} err="failed to get container status \"7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc\": rpc error: code = NotFound desc = could not find container \"7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc\": container with ID starting with 7291cfe43e76fd6f465fa458c50365cb156e4e14f969fdfaaeeb63b4df64acbc not found: ID does not exist" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.241463 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49765687-46a8-4ac7-b4a7-e0003a129e89" (UID: "49765687-46a8-4ac7-b4a7-e0003a129e89"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.269416 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49765687-46a8-4ac7-b4a7-e0003a129e89-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.334270 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-flsjt"] Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.343017 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-flsjt"] Oct 10 09:26:53 crc kubenswrapper[4732]: I1010 09:26:53.671155 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" path="/var/lib/kubelet/pods/49765687-46a8-4ac7-b4a7-e0003a129e89/volumes" Oct 10 09:26:54 crc kubenswrapper[4732]: I1010 09:26:54.527472 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr7xz"] Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.017870 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gr7xz" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="registry-server" containerID="cri-o://5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996" gracePeriod=2 Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.540053 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.718500 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-utilities\") pod \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.719054 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts5w6\" (UniqueName: \"kubernetes.io/projected/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-kube-api-access-ts5w6\") pod \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.719105 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-catalog-content\") pod \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\" (UID: \"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8\") " Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.721575 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-utilities" (OuterVolumeSpecName: "utilities") pod "c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" (UID: "c0bf757f-ebf9-4630-8487-c3b0bb22d4e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.725987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-kube-api-access-ts5w6" (OuterVolumeSpecName: "kube-api-access-ts5w6") pod "c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" (UID: "c0bf757f-ebf9-4630-8487-c3b0bb22d4e8"). InnerVolumeSpecName "kube-api-access-ts5w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.734728 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" (UID: "c0bf757f-ebf9-4630-8487-c3b0bb22d4e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.821936 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts5w6\" (UniqueName: \"kubernetes.io/projected/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-kube-api-access-ts5w6\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.821972 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:55 crc kubenswrapper[4732]: I1010 09:26:55.821985 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.035373 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerID="5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996" exitCode=0 Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.035417 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gr7xz" event={"ID":"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8","Type":"ContainerDied","Data":"5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996"} Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.035449 4732 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gr7xz" event={"ID":"c0bf757f-ebf9-4630-8487-c3b0bb22d4e8","Type":"ContainerDied","Data":"eceedb846461ccd414508c468ed6ae85c92f02cb90d0d58cff3662ed79efb717"} Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.035470 4732 scope.go:117] "RemoveContainer" containerID="5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.035479 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gr7xz" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.078947 4732 scope.go:117] "RemoveContainer" containerID="9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.081376 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr7xz"] Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.091030 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gr7xz"] Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.104142 4732 scope.go:117] "RemoveContainer" containerID="74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.157611 4732 scope.go:117] "RemoveContainer" containerID="5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996" Oct 10 09:26:56 crc kubenswrapper[4732]: E1010 09:26:56.158195 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996\": container with ID starting with 5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996 not found: ID does not exist" containerID="5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.158229 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996"} err="failed to get container status \"5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996\": rpc error: code = NotFound desc = could not find container \"5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996\": container with ID starting with 5eef9a9e4aa9977f041aa28e506e3f6d161f4e6c58d2c4912256dacd65ecb996 not found: ID does not exist" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.158249 4732 scope.go:117] "RemoveContainer" containerID="9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd" Oct 10 09:26:56 crc kubenswrapper[4732]: E1010 09:26:56.158495 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd\": container with ID starting with 9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd not found: ID does not exist" containerID="9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.158527 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd"} err="failed to get container status \"9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd\": rpc error: code = NotFound desc = could not find container \"9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd\": container with ID starting with 9de9a0f4524e095f294e77c377f82e9bba56e4918a5adec9effcfaa8c804bbbd not found: ID does not exist" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.158545 4732 scope.go:117] "RemoveContainer" containerID="74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c" Oct 10 09:26:56 crc kubenswrapper[4732]: E1010 
09:26:56.158901 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c\": container with ID starting with 74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c not found: ID does not exist" containerID="74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.158927 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c"} err="failed to get container status \"74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c\": rpc error: code = NotFound desc = could not find container \"74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c\": container with ID starting with 74e5241abcec709cc282720aa954dead24a083a64682614c0b8a16569c552b1c not found: ID does not exist" Oct 10 09:26:56 crc kubenswrapper[4732]: I1010 09:26:56.661093 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:26:56 crc kubenswrapper[4732]: E1010 09:26:56.661607 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:26:57 crc kubenswrapper[4732]: I1010 09:26:57.672781 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" path="/var/lib/kubelet/pods/c0bf757f-ebf9-4630-8487-c3b0bb22d4e8/volumes" Oct 10 09:27:10 crc kubenswrapper[4732]: I1010 09:27:10.660930 
4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:27:10 crc kubenswrapper[4732]: E1010 09:27:10.661896 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:27:21 crc kubenswrapper[4732]: I1010 09:27:21.660234 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:27:21 crc kubenswrapper[4732]: E1010 09:27:21.661021 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:27:33 crc kubenswrapper[4732]: I1010 09:27:33.668504 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:27:34 crc kubenswrapper[4732]: I1010 09:27:34.433034 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"a53c21aa5b8c1256ccdd750142bf2a065909a0c2ac97b47316abc44ae8da38ca"} Oct 10 09:28:31 crc kubenswrapper[4732]: I1010 09:28:31.029472 4732 generic.go:334] "Generic (PLEG): container finished" podID="b0c399d3-48d7-4316-931f-2115e341ce3d" 
containerID="3c8941daa6e33058484f13eaf50cd0b410079b993357cd93d11c4a3fd5b4a6c7" exitCode=0 Oct 10 09:28:31 crc kubenswrapper[4732]: I1010 09:28:31.029617 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" event={"ID":"b0c399d3-48d7-4316-931f-2115e341ce3d","Type":"ContainerDied","Data":"3c8941daa6e33058484f13eaf50cd0b410079b993357cd93d11c4a3fd5b4a6c7"} Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.454907 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.496188 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-combined-ca-bundle\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.496241 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-1\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.498498 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cells-global-config-0\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.498544 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-1\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.498604 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-inventory\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.498647 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rhj5\" (UniqueName: \"kubernetes.io/projected/b0c399d3-48d7-4316-931f-2115e341ce3d-kube-api-access-8rhj5\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.498677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-0\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.498723 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-0\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.498835 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-ssh-key\") pod \"b0c399d3-48d7-4316-931f-2115e341ce3d\" (UID: \"b0c399d3-48d7-4316-931f-2115e341ce3d\") " Oct 10 09:28:32 crc 
kubenswrapper[4732]: I1010 09:28:32.503524 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.506863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c399d3-48d7-4316-931f-2115e341ce3d-kube-api-access-8rhj5" (OuterVolumeSpecName: "kube-api-access-8rhj5") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "kube-api-access-8rhj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.527035 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-inventory" (OuterVolumeSpecName: "inventory") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.530686 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.534804 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.536891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.537795 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.545647 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.546107 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "b0c399d3-48d7-4316-931f-2115e341ce3d" (UID: "b0c399d3-48d7-4316-931f-2115e341ce3d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601328 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601363 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601374 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601384 4732 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601392 4732 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-inventory\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601403 
4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rhj5\" (UniqueName: \"kubernetes.io/projected/b0c399d3-48d7-4316-931f-2115e341ce3d-kube-api-access-8rhj5\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601412 4732 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601420 4732 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:32 crc kubenswrapper[4732]: I1010 09:28:32.601428 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0c399d3-48d7-4316-931f-2115e341ce3d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 09:28:33 crc kubenswrapper[4732]: I1010 09:28:33.047892 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" event={"ID":"b0c399d3-48d7-4316-931f-2115e341ce3d","Type":"ContainerDied","Data":"6cea1e38e8c2e405d835dd3611e995c5a168f49c05ccf41d2c315abd4ec8eab2"} Oct 10 09:28:33 crc kubenswrapper[4732]: I1010 09:28:33.048311 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cea1e38e8c2e405d835dd3611e995c5a168f49c05ccf41d2c315abd4ec8eab2" Oct 10 09:28:33 crc kubenswrapper[4732]: I1010 09:28:33.047932 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn" Oct 10 09:29:55 crc kubenswrapper[4732]: I1010 09:29:55.356717 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:29:55 crc kubenswrapper[4732]: I1010 09:29:55.357503 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.149879 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6"] Oct 10 09:30:00 crc kubenswrapper[4732]: E1010 09:30:00.151854 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="extract-content" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152072 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="extract-content" Oct 10 09:30:00 crc kubenswrapper[4732]: E1010 09:30:00.152155 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="registry-server" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152167 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="registry-server" Oct 10 09:30:00 crc kubenswrapper[4732]: E1010 09:30:00.152195 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="registry-server" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152203 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="registry-server" Oct 10 09:30:00 crc kubenswrapper[4732]: E1010 09:30:00.152223 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="extract-content" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152232 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="extract-content" Oct 10 09:30:00 crc kubenswrapper[4732]: E1010 09:30:00.152248 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="extract-utilities" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152257 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="extract-utilities" Oct 10 09:30:00 crc kubenswrapper[4732]: E1010 09:30:00.152294 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="extract-utilities" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152304 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="extract-utilities" Oct 10 09:30:00 crc kubenswrapper[4732]: E1010 09:30:00.152321 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c399d3-48d7-4316-931f-2115e341ce3d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152333 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c399d3-48d7-4316-931f-2115e341ce3d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 10 09:30:00 crc kubenswrapper[4732]: 
I1010 09:30:00.152636 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="49765687-46a8-4ac7-b4a7-e0003a129e89" containerName="registry-server" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152666 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c399d3-48d7-4316-931f-2115e341ce3d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.152720 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0bf757f-ebf9-4630-8487-c3b0bb22d4e8" containerName="registry-server" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.154177 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.157734 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.157998 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.160169 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6"] Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.251723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vp6k\" (UniqueName: \"kubernetes.io/projected/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-kube-api-access-6vp6k\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.251915 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-config-volume\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.251951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-secret-volume\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.354108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vp6k\" (UniqueName: \"kubernetes.io/projected/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-kube-api-access-6vp6k\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.354395 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-config-volume\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.354448 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-secret-volume\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.356523 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-config-volume\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.709224 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-secret-volume\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.711023 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vp6k\" (UniqueName: \"kubernetes.io/projected/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-kube-api-access-6vp6k\") pod \"collect-profiles-29334810-zp9d6\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:00 crc kubenswrapper[4732]: I1010 09:30:00.781253 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:01 crc kubenswrapper[4732]: I1010 09:30:01.269184 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6"] Oct 10 09:30:01 crc kubenswrapper[4732]: I1010 09:30:01.967168 4732 generic.go:334] "Generic (PLEG): container finished" podID="a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c" containerID="5e78ef20adea674d4cbd76e39ef978498811b06446ad59ce3832373e429b22ec" exitCode=0 Oct 10 09:30:01 crc kubenswrapper[4732]: I1010 09:30:01.967502 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" event={"ID":"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c","Type":"ContainerDied","Data":"5e78ef20adea674d4cbd76e39ef978498811b06446ad59ce3832373e429b22ec"} Oct 10 09:30:01 crc kubenswrapper[4732]: I1010 09:30:01.967542 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" event={"ID":"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c","Type":"ContainerStarted","Data":"2fee9a33817a5a3267a5a67d50192dc96f875fb665ef746e36e1d55bd2148fc1"} Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.382815 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.520340 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-secret-volume\") pod \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.520497 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vp6k\" (UniqueName: \"kubernetes.io/projected/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-kube-api-access-6vp6k\") pod \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.520589 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-config-volume\") pod \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\" (UID: \"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c\") " Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.521672 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c" (UID: "a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.537818 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c" (UID: "a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.537975 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-kube-api-access-6vp6k" (OuterVolumeSpecName: "kube-api-access-6vp6k") pod "a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c" (UID: "a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c"). InnerVolumeSpecName "kube-api-access-6vp6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.623702 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.623762 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vp6k\" (UniqueName: \"kubernetes.io/projected/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-kube-api-access-6vp6k\") on node \"crc\" DevicePath \"\"" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.623775 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.992015 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" event={"ID":"a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c","Type":"ContainerDied","Data":"2fee9a33817a5a3267a5a67d50192dc96f875fb665ef746e36e1d55bd2148fc1"} Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.992065 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fee9a33817a5a3267a5a67d50192dc96f875fb665ef746e36e1d55bd2148fc1" Oct 10 09:30:03 crc kubenswrapper[4732]: I1010 09:30:03.992103 4732 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334810-zp9d6" Oct 10 09:30:04 crc kubenswrapper[4732]: I1010 09:30:04.475898 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj"] Oct 10 09:30:04 crc kubenswrapper[4732]: I1010 09:30:04.485934 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334765-pbjwj"] Oct 10 09:30:05 crc kubenswrapper[4732]: I1010 09:30:05.672809 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62841533-3fab-4b0b-a51f-a9afff879bba" path="/var/lib/kubelet/pods/62841533-3fab-4b0b-a51f-a9afff879bba/volumes" Oct 10 09:30:20 crc kubenswrapper[4732]: I1010 09:30:20.178531 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 09:30:20 crc kubenswrapper[4732]: I1010 09:30:20.179296 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="ff596335-f475-49d2-8479-da5e2797ac5e" containerName="adoption" containerID="cri-o://250a4aa792ae89c51fcd8a699da9f73825467dd962d0f33aa86cc54f8513ecb4" gracePeriod=30 Oct 10 09:30:25 crc kubenswrapper[4732]: I1010 09:30:25.356145 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:30:25 crc kubenswrapper[4732]: I1010 09:30:25.356815 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 
09:30:39 crc kubenswrapper[4732]: I1010 09:30:39.245594 4732 scope.go:117] "RemoveContainer" containerID="c89d6d547c7d69e0326ec4681c27ad5ee35f8741744e5688593dbd9022c799b1" Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.493738 4732 generic.go:334] "Generic (PLEG): container finished" podID="ff596335-f475-49d2-8479-da5e2797ac5e" containerID="250a4aa792ae89c51fcd8a699da9f73825467dd962d0f33aa86cc54f8513ecb4" exitCode=137 Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.493804 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ff596335-f475-49d2-8479-da5e2797ac5e","Type":"ContainerDied","Data":"250a4aa792ae89c51fcd8a699da9f73825467dd962d0f33aa86cc54f8513ecb4"} Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.706276 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.872216 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\") pod \"ff596335-f475-49d2-8479-da5e2797ac5e\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.872343 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc5jp\" (UniqueName: \"kubernetes.io/projected/ff596335-f475-49d2-8479-da5e2797ac5e-kube-api-access-zc5jp\") pod \"ff596335-f475-49d2-8479-da5e2797ac5e\" (UID: \"ff596335-f475-49d2-8479-da5e2797ac5e\") " Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.883239 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff596335-f475-49d2-8479-da5e2797ac5e-kube-api-access-zc5jp" (OuterVolumeSpecName: "kube-api-access-zc5jp") pod "ff596335-f475-49d2-8479-da5e2797ac5e" (UID: 
"ff596335-f475-49d2-8479-da5e2797ac5e"). InnerVolumeSpecName "kube-api-access-zc5jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.898769 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb" (OuterVolumeSpecName: "mariadb-data") pod "ff596335-f475-49d2-8479-da5e2797ac5e" (UID: "ff596335-f475-49d2-8479-da5e2797ac5e"). InnerVolumeSpecName "pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.974763 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\") on node \"crc\" " Oct 10 09:30:50 crc kubenswrapper[4732]: I1010 09:30:50.975207 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc5jp\" (UniqueName: \"kubernetes.io/projected/ff596335-f475-49d2-8479-da5e2797ac5e-kube-api-access-zc5jp\") on node \"crc\" DevicePath \"\"" Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.025150 4732 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.025382 4732 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb") on node "crc" Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.077619 4732 reconciler_common.go:293] "Volume detached for volume \"pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2523117a-3157-4f4c-a2b2-68c72e6433fb\") on node \"crc\" DevicePath \"\"" Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.506185 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"ff596335-f475-49d2-8479-da5e2797ac5e","Type":"ContainerDied","Data":"110fabd8a502e1f632bdfb473bc05a6fdf6c4ed69decbb61dcc231f1dcd7a460"} Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.506249 4732 scope.go:117] "RemoveContainer" containerID="250a4aa792ae89c51fcd8a699da9f73825467dd962d0f33aa86cc54f8513ecb4" Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.506353 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.568869 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.580920 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Oct 10 09:30:51 crc kubenswrapper[4732]: I1010 09:30:51.674870 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff596335-f475-49d2-8479-da5e2797ac5e" path="/var/lib/kubelet/pods/ff596335-f475-49d2-8479-da5e2797ac5e/volumes" Oct 10 09:30:52 crc kubenswrapper[4732]: I1010 09:30:52.229046 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 10 09:30:52 crc kubenswrapper[4732]: I1010 09:30:52.229617 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="5d21ef00-0975-464e-9ebf-c36b2e1c101e" containerName="adoption" containerID="cri-o://ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06" gracePeriod=30 Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.356976 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.357450 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.357520 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.358828 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a53c21aa5b8c1256ccdd750142bf2a065909a0c2ac97b47316abc44ae8da38ca"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.358945 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://a53c21aa5b8c1256ccdd750142bf2a065909a0c2ac97b47316abc44ae8da38ca" gracePeriod=600 Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.559862 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="a53c21aa5b8c1256ccdd750142bf2a065909a0c2ac97b47316abc44ae8da38ca" exitCode=0 Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.559924 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"a53c21aa5b8c1256ccdd750142bf2a065909a0c2ac97b47316abc44ae8da38ca"} Oct 10 09:30:55 crc kubenswrapper[4732]: I1010 09:30:55.559970 4732 scope.go:117] "RemoveContainer" containerID="d5c745c78db16ec5c08c590ff43aae583b7d4236868cf276f1bc3f5d9c8a544d" Oct 10 09:30:56 crc kubenswrapper[4732]: I1010 09:30:56.574091 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd"} Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.496474 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86bvh"] Oct 10 09:31:08 crc kubenswrapper[4732]: E1010 09:31:08.497443 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c" containerName="collect-profiles" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.497459 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c" containerName="collect-profiles" Oct 10 09:31:08 crc kubenswrapper[4732]: E1010 09:31:08.497494 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff596335-f475-49d2-8479-da5e2797ac5e" containerName="adoption" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.497500 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff596335-f475-49d2-8479-da5e2797ac5e" containerName="adoption" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.497867 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ffe78a-46f5-45a1-a0ec-5dbd0c43609c" containerName="collect-profiles" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.497877 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff596335-f475-49d2-8479-da5e2797ac5e" containerName="adoption" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.499370 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.517897 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86bvh"] Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.637219 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv6lc\" (UniqueName: \"kubernetes.io/projected/218389f0-4637-49f0-b063-c18740f214c5-kube-api-access-bv6lc\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.637384 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-catalog-content\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.637436 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-utilities\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.739387 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-catalog-content\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.739472 4732 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-utilities\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.739600 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv6lc\" (UniqueName: \"kubernetes.io/projected/218389f0-4637-49f0-b063-c18740f214c5-kube-api-access-bv6lc\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.739931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-catalog-content\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.740557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-utilities\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.760434 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv6lc\" (UniqueName: \"kubernetes.io/projected/218389f0-4637-49f0-b063-c18740f214c5-kube-api-access-bv6lc\") pod \"redhat-operators-86bvh\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:08 crc kubenswrapper[4732]: I1010 09:31:08.820287 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:09 crc kubenswrapper[4732]: I1010 09:31:09.309160 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86bvh"] Oct 10 09:31:09 crc kubenswrapper[4732]: I1010 09:31:09.711929 4732 generic.go:334] "Generic (PLEG): container finished" podID="218389f0-4637-49f0-b063-c18740f214c5" containerID="e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd" exitCode=0 Oct 10 09:31:09 crc kubenswrapper[4732]: I1010 09:31:09.712007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86bvh" event={"ID":"218389f0-4637-49f0-b063-c18740f214c5","Type":"ContainerDied","Data":"e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd"} Oct 10 09:31:09 crc kubenswrapper[4732]: I1010 09:31:09.712289 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86bvh" event={"ID":"218389f0-4637-49f0-b063-c18740f214c5","Type":"ContainerStarted","Data":"07ac6089a3a9c29cf365b5b67ee1c6e507db72c3670988cab2e37fdbc830dd35"} Oct 10 09:31:09 crc kubenswrapper[4732]: I1010 09:31:09.714228 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:31:11 crc kubenswrapper[4732]: I1010 09:31:11.737221 4732 generic.go:334] "Generic (PLEG): container finished" podID="218389f0-4637-49f0-b063-c18740f214c5" containerID="7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae" exitCode=0 Oct 10 09:31:11 crc kubenswrapper[4732]: I1010 09:31:11.737329 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86bvh" event={"ID":"218389f0-4637-49f0-b063-c18740f214c5","Type":"ContainerDied","Data":"7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae"} Oct 10 09:31:12 crc kubenswrapper[4732]: I1010 09:31:12.751242 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-86bvh" event={"ID":"218389f0-4637-49f0-b063-c18740f214c5","Type":"ContainerStarted","Data":"f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f"} Oct 10 09:31:12 crc kubenswrapper[4732]: I1010 09:31:12.776382 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86bvh" podStartSLOduration=2.3621529 podStartE2EDuration="4.776364855s" podCreationTimestamp="2025-10-10 09:31:08 +0000 UTC" firstStartedPulling="2025-10-10 09:31:09.713886394 +0000 UTC m=+9596.783477635" lastFinishedPulling="2025-10-10 09:31:12.128098339 +0000 UTC m=+9599.197689590" observedRunningTime="2025-10-10 09:31:12.770542236 +0000 UTC m=+9599.840133537" watchObservedRunningTime="2025-10-10 09:31:12.776364855 +0000 UTC m=+9599.845956096" Oct 10 09:31:18 crc kubenswrapper[4732]: I1010 09:31:18.821788 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:18 crc kubenswrapper[4732]: I1010 09:31:18.822355 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:18 crc kubenswrapper[4732]: I1010 09:31:18.876138 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:19 crc kubenswrapper[4732]: I1010 09:31:19.862651 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:19 crc kubenswrapper[4732]: I1010 09:31:19.906910 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86bvh"] Oct 10 09:31:21 crc kubenswrapper[4732]: I1010 09:31:21.836430 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-86bvh" podUID="218389f0-4637-49f0-b063-c18740f214c5" 
containerName="registry-server" containerID="cri-o://f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f" gracePeriod=2 Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.453389 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.508976 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv6lc\" (UniqueName: \"kubernetes.io/projected/218389f0-4637-49f0-b063-c18740f214c5-kube-api-access-bv6lc\") pod \"218389f0-4637-49f0-b063-c18740f214c5\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.509081 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-utilities\") pod \"218389f0-4637-49f0-b063-c18740f214c5\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.509148 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-catalog-content\") pod \"218389f0-4637-49f0-b063-c18740f214c5\" (UID: \"218389f0-4637-49f0-b063-c18740f214c5\") " Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.509833 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-utilities" (OuterVolumeSpecName: "utilities") pod "218389f0-4637-49f0-b063-c18740f214c5" (UID: "218389f0-4637-49f0-b063-c18740f214c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.516888 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218389f0-4637-49f0-b063-c18740f214c5-kube-api-access-bv6lc" (OuterVolumeSpecName: "kube-api-access-bv6lc") pod "218389f0-4637-49f0-b063-c18740f214c5" (UID: "218389f0-4637-49f0-b063-c18740f214c5"). InnerVolumeSpecName "kube-api-access-bv6lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.606286 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "218389f0-4637-49f0-b063-c18740f214c5" (UID: "218389f0-4637-49f0-b063-c18740f214c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.611688 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.611737 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218389f0-4637-49f0-b063-c18740f214c5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.611753 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv6lc\" (UniqueName: \"kubernetes.io/projected/218389f0-4637-49f0-b063-c18740f214c5-kube-api-access-bv6lc\") on node \"crc\" DevicePath \"\"" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.624517 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.713623 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910\") pod \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.713927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsrlr\" (UniqueName: \"kubernetes.io/projected/5d21ef00-0975-464e-9ebf-c36b2e1c101e-kube-api-access-zsrlr\") pod \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.713963 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5d21ef00-0975-464e-9ebf-c36b2e1c101e-ovn-data-cert\") pod \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\" (UID: \"5d21ef00-0975-464e-9ebf-c36b2e1c101e\") " Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.718960 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d21ef00-0975-464e-9ebf-c36b2e1c101e-kube-api-access-zsrlr" (OuterVolumeSpecName: "kube-api-access-zsrlr") pod "5d21ef00-0975-464e-9ebf-c36b2e1c101e" (UID: "5d21ef00-0975-464e-9ebf-c36b2e1c101e"). InnerVolumeSpecName "kube-api-access-zsrlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.719348 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d21ef00-0975-464e-9ebf-c36b2e1c101e-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "5d21ef00-0975-464e-9ebf-c36b2e1c101e" (UID: "5d21ef00-0975-464e-9ebf-c36b2e1c101e"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.730924 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910" (OuterVolumeSpecName: "ovn-data") pod "5d21ef00-0975-464e-9ebf-c36b2e1c101e" (UID: "5d21ef00-0975-464e-9ebf-c36b2e1c101e"). InnerVolumeSpecName "pvc-e9bde394-af66-4806-b48a-581bdca4a910". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.816573 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsrlr\" (UniqueName: \"kubernetes.io/projected/5d21ef00-0975-464e-9ebf-c36b2e1c101e-kube-api-access-zsrlr\") on node \"crc\" DevicePath \"\"" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.816609 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/5d21ef00-0975-464e-9ebf-c36b2e1c101e-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.816645 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e9bde394-af66-4806-b48a-581bdca4a910\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910\") on node \"crc\" " Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.840917 4732 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.841185 4732 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e9bde394-af66-4806-b48a-581bdca4a910" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910") on node "crc" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.851790 4732 generic.go:334] "Generic (PLEG): container finished" podID="218389f0-4637-49f0-b063-c18740f214c5" containerID="f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f" exitCode=0 Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.851850 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86bvh" event={"ID":"218389f0-4637-49f0-b063-c18740f214c5","Type":"ContainerDied","Data":"f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f"} Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.851911 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86bvh" event={"ID":"218389f0-4637-49f0-b063-c18740f214c5","Type":"ContainerDied","Data":"07ac6089a3a9c29cf365b5b67ee1c6e507db72c3670988cab2e37fdbc830dd35"} Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.851944 4732 scope.go:117] "RemoveContainer" containerID="f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.852441 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86bvh" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.854904 4732 generic.go:334] "Generic (PLEG): container finished" podID="5d21ef00-0975-464e-9ebf-c36b2e1c101e" containerID="ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06" exitCode=137 Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.854962 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.854960 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5d21ef00-0975-464e-9ebf-c36b2e1c101e","Type":"ContainerDied","Data":"ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06"} Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.855105 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"5d21ef00-0975-464e-9ebf-c36b2e1c101e","Type":"ContainerDied","Data":"2da36ac2fb5ef4433921c4e1f1a7cc24d71611ab021d396dd9ef11c2d07ab0c0"} Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.884682 4732 scope.go:117] "RemoveContainer" containerID="7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.918709 4732 reconciler_common.go:293] "Volume detached for volume \"pvc-e9bde394-af66-4806-b48a-581bdca4a910\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9bde394-af66-4806-b48a-581bdca4a910\") on node \"crc\" DevicePath \"\"" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.919110 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86bvh"] Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.929567 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-86bvh"] Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.933058 4732 scope.go:117] "RemoveContainer" containerID="e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.937316 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.948246 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Oct 10 09:31:22 crc kubenswrapper[4732]: 
I1010 09:31:22.963342 4732 scope.go:117] "RemoveContainer" containerID="f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f" Oct 10 09:31:22 crc kubenswrapper[4732]: E1010 09:31:22.963908 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f\": container with ID starting with f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f not found: ID does not exist" containerID="f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.963944 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f"} err="failed to get container status \"f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f\": rpc error: code = NotFound desc = could not find container \"f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f\": container with ID starting with f931a9bff77734e3c34f75eaa4e8b86a6b6473bc8a8f6ce5b7c733b8f25e244f not found: ID does not exist" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.963964 4732 scope.go:117] "RemoveContainer" containerID="7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae" Oct 10 09:31:22 crc kubenswrapper[4732]: E1010 09:31:22.964385 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae\": container with ID starting with 7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae not found: ID does not exist" containerID="7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.964410 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae"} err="failed to get container status \"7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae\": rpc error: code = NotFound desc = could not find container \"7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae\": container with ID starting with 7748bb828a339efa0c65c2ca9933109a6ef537ab433f007aaf5a1ada8cdbedae not found: ID does not exist" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.964428 4732 scope.go:117] "RemoveContainer" containerID="e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd" Oct 10 09:31:22 crc kubenswrapper[4732]: E1010 09:31:22.964699 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd\": container with ID starting with e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd not found: ID does not exist" containerID="e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.964720 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd"} err="failed to get container status \"e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd\": rpc error: code = NotFound desc = could not find container \"e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd\": container with ID starting with e71567be8281cc3d5f473c7ac40511408e65a5723a34cc45456cca687bab4ccd not found: ID does not exist" Oct 10 09:31:22 crc kubenswrapper[4732]: I1010 09:31:22.964733 4732 scope.go:117] "RemoveContainer" containerID="ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06" Oct 10 09:31:23 crc kubenswrapper[4732]: I1010 09:31:23.034576 4732 scope.go:117] "RemoveContainer" 
containerID="ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06" Oct 10 09:31:23 crc kubenswrapper[4732]: E1010 09:31:23.035023 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06\": container with ID starting with ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06 not found: ID does not exist" containerID="ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06" Oct 10 09:31:23 crc kubenswrapper[4732]: I1010 09:31:23.035061 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06"} err="failed to get container status \"ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06\": rpc error: code = NotFound desc = could not find container \"ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06\": container with ID starting with ab6a61beb9dc29bdc3ea636b8a4d18fe87eefe3e353c8786054ba829344cbb06 not found: ID does not exist" Oct 10 09:31:23 crc kubenswrapper[4732]: I1010 09:31:23.672913 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218389f0-4637-49f0-b063-c18740f214c5" path="/var/lib/kubelet/pods/218389f0-4637-49f0-b063-c18740f214c5/volumes" Oct 10 09:31:23 crc kubenswrapper[4732]: I1010 09:31:23.675483 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d21ef00-0975-464e-9ebf-c36b2e1c101e" path="/var/lib/kubelet/pods/5d21ef00-0975-464e-9ebf-c36b2e1c101e/volumes" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.928911 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 10 09:31:42 crc kubenswrapper[4732]: E1010 09:31:42.937710 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218389f0-4637-49f0-b063-c18740f214c5" containerName="extract-utilities" 
Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.938092 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="218389f0-4637-49f0-b063-c18740f214c5" containerName="extract-utilities" Oct 10 09:31:42 crc kubenswrapper[4732]: E1010 09:31:42.938118 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d21ef00-0975-464e-9ebf-c36b2e1c101e" containerName="adoption" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.938126 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d21ef00-0975-464e-9ebf-c36b2e1c101e" containerName="adoption" Oct 10 09:31:42 crc kubenswrapper[4732]: E1010 09:31:42.938163 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218389f0-4637-49f0-b063-c18740f214c5" containerName="extract-content" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.938171 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="218389f0-4637-49f0-b063-c18740f214c5" containerName="extract-content" Oct 10 09:31:42 crc kubenswrapper[4732]: E1010 09:31:42.938203 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218389f0-4637-49f0-b063-c18740f214c5" containerName="registry-server" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.938211 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="218389f0-4637-49f0-b063-c18740f214c5" containerName="registry-server" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.938519 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="218389f0-4637-49f0-b063-c18740f214c5" containerName="registry-server" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.938544 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d21ef00-0975-464e-9ebf-c36b2e1c101e" containerName="adoption" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.939327 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.948114 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.948212 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.948258 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.948331 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5ddsv" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.950267 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976104 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xrdj\" (UniqueName: \"kubernetes.io/projected/16c8a157-203f-48fc-a30d-bd652be267f5-kube-api-access-2xrdj\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976182 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976269 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976336 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976454 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976479 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-config-data\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:42 crc kubenswrapper[4732]: I1010 09:31:42.976545 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.078640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xrdj\" (UniqueName: \"kubernetes.io/projected/16c8a157-203f-48fc-a30d-bd652be267f5-kube-api-access-2xrdj\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.078785 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.078913 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.079169 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.079316 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.079434 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.079464 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.079502 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-config-data\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.079539 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: 
\"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.079710 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.080191 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.080825 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.080945 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.081511 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-config-data\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " 
pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.086756 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.096859 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.098870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.100280 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xrdj\" (UniqueName: \"kubernetes.io/projected/16c8a157-203f-48fc-a30d-bd652be267f5-kube-api-access-2xrdj\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.124600 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.268924 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 10 09:31:43 crc kubenswrapper[4732]: I1010 09:31:43.751423 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 10 09:31:44 crc kubenswrapper[4732]: I1010 09:31:44.087478 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16c8a157-203f-48fc-a30d-bd652be267f5","Type":"ContainerStarted","Data":"2ad594bdf3dd4c086ddf916a19e9855f8315a7130690dbd9d1359141329751f8"} Oct 10 09:32:32 crc kubenswrapper[4732]: E1010 09:32:32.263971 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c4b77291aeca5591ac860bd4127cec2f" Oct 10 09:32:32 crc kubenswrapper[4732]: E1010 09:32:32.264522 4732 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c4b77291aeca5591ac860bd4127cec2f" Oct 10 09:32:32 crc kubenswrapper[4732]: E1010 09:32:32.264660 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c4b77291aeca5591ac860bd4127cec2f,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2xrdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(16c8a157-203f-48fc-a30d-bd652be267f5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 10 09:32:32 crc kubenswrapper[4732]: E1010 09:32:32.265916 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="16c8a157-203f-48fc-a30d-bd652be267f5" Oct 10 09:32:32 crc kubenswrapper[4732]: E1010 09:32:32.630646 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:c4b77291aeca5591ac860bd4127cec2f\\\"\"" pod="openstack/tempest-tests-tempest" 
podUID="16c8a157-203f-48fc-a30d-bd652be267f5" Oct 10 09:32:46 crc kubenswrapper[4732]: I1010 09:32:46.840651 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 10 09:32:48 crc kubenswrapper[4732]: I1010 09:32:48.794400 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16c8a157-203f-48fc-a30d-bd652be267f5","Type":"ContainerStarted","Data":"38cdcca4b50306991716f988e4d6d45b4116850c7a5af4185206b3bbac3def11"} Oct 10 09:32:48 crc kubenswrapper[4732]: I1010 09:32:48.816095 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.795665524 podStartE2EDuration="1m7.816073361s" podCreationTimestamp="2025-10-10 09:31:41 +0000 UTC" firstStartedPulling="2025-10-10 09:31:43.818126202 +0000 UTC m=+9630.887717453" lastFinishedPulling="2025-10-10 09:32:46.838534059 +0000 UTC m=+9693.908125290" observedRunningTime="2025-10-10 09:32:48.812870944 +0000 UTC m=+9695.882462215" watchObservedRunningTime="2025-10-10 09:32:48.816073361 +0000 UTC m=+9695.885664602" Oct 10 09:32:55 crc kubenswrapper[4732]: I1010 09:32:55.356274 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:32:55 crc kubenswrapper[4732]: I1010 09:32:55.356808 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:33:25 crc kubenswrapper[4732]: I1010 09:33:25.356177 4732 patch_prober.go:28] 
interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:33:25 crc kubenswrapper[4732]: I1010 09:33:25.356582 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.355653 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.356392 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.356567 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.357665 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.357780 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" gracePeriod=600 Oct 10 09:33:55 crc kubenswrapper[4732]: E1010 09:33:55.491216 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.492822 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" exitCode=0 Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.492891 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd"} Oct 10 09:33:55 crc kubenswrapper[4732]: I1010 09:33:55.493158 4732 scope.go:117] "RemoveContainer" containerID="a53c21aa5b8c1256ccdd750142bf2a065909a0c2ac97b47316abc44ae8da38ca" Oct 10 09:33:56 crc kubenswrapper[4732]: I1010 09:33:56.505222 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:33:56 crc kubenswrapper[4732]: E1010 09:33:56.505971 4732 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:34:08 crc kubenswrapper[4732]: I1010 09:34:08.660208 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:34:08 crc kubenswrapper[4732]: E1010 09:34:08.661036 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:34:23 crc kubenswrapper[4732]: I1010 09:34:23.668334 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:34:23 crc kubenswrapper[4732]: E1010 09:34:23.669170 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:34:38 crc kubenswrapper[4732]: I1010 09:34:38.660339 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:34:38 crc kubenswrapper[4732]: E1010 09:34:38.661271 4732 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:34:53 crc kubenswrapper[4732]: I1010 09:34:53.669571 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:34:53 crc kubenswrapper[4732]: E1010 09:34:53.670320 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:35:06 crc kubenswrapper[4732]: I1010 09:35:06.660939 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:35:06 crc kubenswrapper[4732]: E1010 09:35:06.661834 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:35:19 crc kubenswrapper[4732]: I1010 09:35:19.660873 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:35:19 crc kubenswrapper[4732]: E1010 
09:35:19.663173 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:35:31 crc kubenswrapper[4732]: I1010 09:35:31.660857 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:35:31 crc kubenswrapper[4732]: E1010 09:35:31.662354 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:35:44 crc kubenswrapper[4732]: I1010 09:35:44.660889 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:35:44 crc kubenswrapper[4732]: E1010 09:35:44.661775 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:35:56 crc kubenswrapper[4732]: I1010 09:35:56.660194 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:35:56 crc 
kubenswrapper[4732]: E1010 09:35:56.660804 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:36:11 crc kubenswrapper[4732]: I1010 09:36:11.660738 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:36:11 crc kubenswrapper[4732]: E1010 09:36:11.661578 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:36:26 crc kubenswrapper[4732]: I1010 09:36:26.660600 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:36:26 crc kubenswrapper[4732]: E1010 09:36:26.661625 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:36:39 crc kubenswrapper[4732]: I1010 09:36:39.661349 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 
10 09:36:39 crc kubenswrapper[4732]: E1010 09:36:39.662163 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:36:53 crc kubenswrapper[4732]: I1010 09:36:53.667856 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:36:53 crc kubenswrapper[4732]: E1010 09:36:53.668911 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:37:06 crc kubenswrapper[4732]: I1010 09:37:06.660929 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:37:06 crc kubenswrapper[4732]: E1010 09:37:06.661937 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:37:19 crc kubenswrapper[4732]: I1010 09:37:19.660787 4732 scope.go:117] "RemoveContainer" 
containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:37:19 crc kubenswrapper[4732]: E1010 09:37:19.663179 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:37:33 crc kubenswrapper[4732]: I1010 09:37:33.667287 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:37:33 crc kubenswrapper[4732]: E1010 09:37:33.668096 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:37:45 crc kubenswrapper[4732]: I1010 09:37:45.660440 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:37:45 crc kubenswrapper[4732]: E1010 09:37:45.661287 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:37:59 crc kubenswrapper[4732]: I1010 09:37:59.659961 4732 scope.go:117] 
"RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:37:59 crc kubenswrapper[4732]: E1010 09:37:59.660825 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:38:03 crc kubenswrapper[4732]: I1010 09:38:03.946769 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8qfrs"] Oct 10 09:38:03 crc kubenswrapper[4732]: I1010 09:38:03.949279 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:03 crc kubenswrapper[4732]: I1010 09:38:03.977155 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qfrs"] Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.019268 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nstsd\" (UniqueName: \"kubernetes.io/projected/b3169162-5742-468e-a7e8-37edb8e083ac-kube-api-access-nstsd\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.019328 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-catalog-content\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc 
kubenswrapper[4732]: I1010 09:38:04.019399 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-utilities\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.120682 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-utilities\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.121189 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nstsd\" (UniqueName: \"kubernetes.io/projected/b3169162-5742-468e-a7e8-37edb8e083ac-kube-api-access-nstsd\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.121331 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-catalog-content\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.121341 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-utilities\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 
09:38:04.121557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-catalog-content\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.144685 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nstsd\" (UniqueName: \"kubernetes.io/projected/b3169162-5742-468e-a7e8-37edb8e083ac-kube-api-access-nstsd\") pod \"community-operators-8qfrs\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:04 crc kubenswrapper[4732]: I1010 09:38:04.288660 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:05 crc kubenswrapper[4732]: I1010 09:38:05.181679 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qfrs"] Oct 10 09:38:05 crc kubenswrapper[4732]: I1010 09:38:05.892711 4732 generic.go:334] "Generic (PLEG): container finished" podID="b3169162-5742-468e-a7e8-37edb8e083ac" containerID="41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2" exitCode=0 Oct 10 09:38:05 crc kubenswrapper[4732]: I1010 09:38:05.893062 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qfrs" event={"ID":"b3169162-5742-468e-a7e8-37edb8e083ac","Type":"ContainerDied","Data":"41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2"} Oct 10 09:38:05 crc kubenswrapper[4732]: I1010 09:38:05.893091 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qfrs" 
event={"ID":"b3169162-5742-468e-a7e8-37edb8e083ac","Type":"ContainerStarted","Data":"38f3d583fe3f87c282937c1e846830b423821f2403d9ba02f6ab14c87538d81e"} Oct 10 09:38:05 crc kubenswrapper[4732]: I1010 09:38:05.895407 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:38:06 crc kubenswrapper[4732]: I1010 09:38:06.905781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qfrs" event={"ID":"b3169162-5742-468e-a7e8-37edb8e083ac","Type":"ContainerStarted","Data":"6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12"} Oct 10 09:38:08 crc kubenswrapper[4732]: I1010 09:38:08.927466 4732 generic.go:334] "Generic (PLEG): container finished" podID="b3169162-5742-468e-a7e8-37edb8e083ac" containerID="6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12" exitCode=0 Oct 10 09:38:08 crc kubenswrapper[4732]: I1010 09:38:08.927548 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qfrs" event={"ID":"b3169162-5742-468e-a7e8-37edb8e083ac","Type":"ContainerDied","Data":"6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12"} Oct 10 09:38:09 crc kubenswrapper[4732]: I1010 09:38:09.939954 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qfrs" event={"ID":"b3169162-5742-468e-a7e8-37edb8e083ac","Type":"ContainerStarted","Data":"4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83"} Oct 10 09:38:09 crc kubenswrapper[4732]: I1010 09:38:09.967957 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8qfrs" podStartSLOduration=3.360887075 podStartE2EDuration="6.96793518s" podCreationTimestamp="2025-10-10 09:38:03 +0000 UTC" firstStartedPulling="2025-10-10 09:38:05.895198949 +0000 UTC m=+10012.964790180" lastFinishedPulling="2025-10-10 09:38:09.502247044 +0000 UTC 
m=+10016.571838285" observedRunningTime="2025-10-10 09:38:09.959400489 +0000 UTC m=+10017.028991760" watchObservedRunningTime="2025-10-10 09:38:09.96793518 +0000 UTC m=+10017.037526441" Oct 10 09:38:14 crc kubenswrapper[4732]: I1010 09:38:14.289896 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:14 crc kubenswrapper[4732]: I1010 09:38:14.290524 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:14 crc kubenswrapper[4732]: I1010 09:38:14.354459 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:14 crc kubenswrapper[4732]: I1010 09:38:14.660499 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:38:14 crc kubenswrapper[4732]: E1010 09:38:14.660774 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:38:15 crc kubenswrapper[4732]: I1010 09:38:15.041188 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:15 crc kubenswrapper[4732]: I1010 09:38:15.097732 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qfrs"] Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.001616 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8qfrs" 
podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="registry-server" containerID="cri-o://4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83" gracePeriod=2 Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.642341 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.812565 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-catalog-content\") pod \"b3169162-5742-468e-a7e8-37edb8e083ac\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.812791 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nstsd\" (UniqueName: \"kubernetes.io/projected/b3169162-5742-468e-a7e8-37edb8e083ac-kube-api-access-nstsd\") pod \"b3169162-5742-468e-a7e8-37edb8e083ac\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.812863 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-utilities\") pod \"b3169162-5742-468e-a7e8-37edb8e083ac\" (UID: \"b3169162-5742-468e-a7e8-37edb8e083ac\") " Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.814658 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-utilities" (OuterVolumeSpecName: "utilities") pod "b3169162-5742-468e-a7e8-37edb8e083ac" (UID: "b3169162-5742-468e-a7e8-37edb8e083ac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.828341 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3169162-5742-468e-a7e8-37edb8e083ac-kube-api-access-nstsd" (OuterVolumeSpecName: "kube-api-access-nstsd") pod "b3169162-5742-468e-a7e8-37edb8e083ac" (UID: "b3169162-5742-468e-a7e8-37edb8e083ac"). InnerVolumeSpecName "kube-api-access-nstsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.871628 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3169162-5742-468e-a7e8-37edb8e083ac" (UID: "b3169162-5742-468e-a7e8-37edb8e083ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.916206 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nstsd\" (UniqueName: \"kubernetes.io/projected/b3169162-5742-468e-a7e8-37edb8e083ac-kube-api-access-nstsd\") on node \"crc\" DevicePath \"\"" Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.916273 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:38:17 crc kubenswrapper[4732]: I1010 09:38:17.916287 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3169162-5742-468e-a7e8-37edb8e083ac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.013000 4732 generic.go:334] "Generic (PLEG): container finished" podID="b3169162-5742-468e-a7e8-37edb8e083ac" 
containerID="4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83" exitCode=0 Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.013041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qfrs" event={"ID":"b3169162-5742-468e-a7e8-37edb8e083ac","Type":"ContainerDied","Data":"4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83"} Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.013075 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qfrs" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.013111 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qfrs" event={"ID":"b3169162-5742-468e-a7e8-37edb8e083ac","Type":"ContainerDied","Data":"38f3d583fe3f87c282937c1e846830b423821f2403d9ba02f6ab14c87538d81e"} Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.013135 4732 scope.go:117] "RemoveContainer" containerID="4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.062854 4732 scope.go:117] "RemoveContainer" containerID="6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.074263 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qfrs"] Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.081060 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8qfrs"] Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.120871 4732 scope.go:117] "RemoveContainer" containerID="41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.161665 4732 scope.go:117] "RemoveContainer" containerID="4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83" Oct 10 
09:38:18 crc kubenswrapper[4732]: E1010 09:38:18.162516 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83\": container with ID starting with 4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83 not found: ID does not exist" containerID="4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.162569 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83"} err="failed to get container status \"4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83\": rpc error: code = NotFound desc = could not find container \"4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83\": container with ID starting with 4d167858e3504f3974b7aca5013401247e0794ebaabfb740bb55a77649d23e83 not found: ID does not exist" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.162601 4732 scope.go:117] "RemoveContainer" containerID="6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12" Oct 10 09:38:18 crc kubenswrapper[4732]: E1010 09:38:18.163170 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12\": container with ID starting with 6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12 not found: ID does not exist" containerID="6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.163200 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12"} err="failed to get container status 
\"6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12\": rpc error: code = NotFound desc = could not find container \"6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12\": container with ID starting with 6de7f175453bcc6df9d6a530ec9f385ecbaef6b6093e0d8a220b601bd0e29f12 not found: ID does not exist" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.163220 4732 scope.go:117] "RemoveContainer" containerID="41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2" Oct 10 09:38:18 crc kubenswrapper[4732]: E1010 09:38:18.163411 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2\": container with ID starting with 41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2 not found: ID does not exist" containerID="41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2" Oct 10 09:38:18 crc kubenswrapper[4732]: I1010 09:38:18.163429 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2"} err="failed to get container status \"41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2\": rpc error: code = NotFound desc = could not find container \"41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2\": container with ID starting with 41ddc1ef9ac378d4ce8723a39e89749e165739133a0945ee82ac32876d58b9c2 not found: ID does not exist" Oct 10 09:38:19 crc kubenswrapper[4732]: I1010 09:38:19.671491 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" path="/var/lib/kubelet/pods/b3169162-5742-468e-a7e8-37edb8e083ac/volumes" Oct 10 09:38:28 crc kubenswrapper[4732]: I1010 09:38:28.660082 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 
09:38:28 crc kubenswrapper[4732]: E1010 09:38:28.660755 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:38:41 crc kubenswrapper[4732]: I1010 09:38:41.660127 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:38:41 crc kubenswrapper[4732]: E1010 09:38:41.660953 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:38:52 crc kubenswrapper[4732]: I1010 09:38:52.660267 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:38:52 crc kubenswrapper[4732]: E1010 09:38:52.661191 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:39:07 crc kubenswrapper[4732]: I1010 09:39:07.668773 4732 scope.go:117] "RemoveContainer" 
containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:39:08 crc kubenswrapper[4732]: I1010 09:39:08.469619 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"b398c30a8dc125d1670d9a1305c808485ecb86a3d8e24829aac9c1710bfd803b"} Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.808407 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t5fr7"] Oct 10 09:40:34 crc kubenswrapper[4732]: E1010 09:40:34.809293 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="extract-utilities" Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.809310 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="extract-utilities" Oct 10 09:40:34 crc kubenswrapper[4732]: E1010 09:40:34.809372 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="registry-server" Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.809380 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="registry-server" Oct 10 09:40:34 crc kubenswrapper[4732]: E1010 09:40:34.809394 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="extract-content" Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.809401 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="extract-content" Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.809630 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3169162-5742-468e-a7e8-37edb8e083ac" containerName="registry-server" Oct 10 09:40:34 
crc kubenswrapper[4732]: I1010 09:40:34.811270 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.822260 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5fr7"] Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.913333 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f648g\" (UniqueName: \"kubernetes.io/projected/56179d94-92b8-4d29-ad44-3a09ffed2d09-kube-api-access-f648g\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.913436 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-utilities\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:34 crc kubenswrapper[4732]: I1010 09:40:34.913479 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-catalog-content\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.015505 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f648g\" (UniqueName: \"kubernetes.io/projected/56179d94-92b8-4d29-ad44-3a09ffed2d09-kube-api-access-f648g\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" 
Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.015612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-utilities\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.015667 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-catalog-content\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.017957 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-catalog-content\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.018527 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-utilities\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.061268 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f648g\" (UniqueName: \"kubernetes.io/projected/56179d94-92b8-4d29-ad44-3a09ffed2d09-kube-api-access-f648g\") pod \"redhat-marketplace-t5fr7\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 
09:40:35.142375 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.406352 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9kptw"] Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.409471 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.423247 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-catalog-content\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.423302 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9f7\" (UniqueName: \"kubernetes.io/projected/e8664d3f-7089-4ad9-8cef-6f08cff48999-kube-api-access-ln9f7\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.423326 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-utilities\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.426388 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kptw"] Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.525093 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-catalog-content\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.525173 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9f7\" (UniqueName: \"kubernetes.io/projected/e8664d3f-7089-4ad9-8cef-6f08cff48999-kube-api-access-ln9f7\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.525211 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-utilities\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.529286 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-utilities\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.529956 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-catalog-content\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.618734 4732 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-t5fr7"] Oct 10 09:40:35 crc kubenswrapper[4732]: I1010 09:40:35.810358 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9f7\" (UniqueName: \"kubernetes.io/projected/e8664d3f-7089-4ad9-8cef-6f08cff48999-kube-api-access-ln9f7\") pod \"certified-operators-9kptw\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:36 crc kubenswrapper[4732]: I1010 09:40:36.065836 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:36 crc kubenswrapper[4732]: I1010 09:40:36.361092 4732 generic.go:334] "Generic (PLEG): container finished" podID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerID="4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc" exitCode=0 Oct 10 09:40:36 crc kubenswrapper[4732]: I1010 09:40:36.361460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5fr7" event={"ID":"56179d94-92b8-4d29-ad44-3a09ffed2d09","Type":"ContainerDied","Data":"4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc"} Oct 10 09:40:36 crc kubenswrapper[4732]: I1010 09:40:36.361488 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5fr7" event={"ID":"56179d94-92b8-4d29-ad44-3a09ffed2d09","Type":"ContainerStarted","Data":"3c610a18441d1ae9db025378a7f795444c1de0d64d78ff4f66cc06fb7ebc37b4"} Oct 10 09:40:36 crc kubenswrapper[4732]: I1010 09:40:36.625815 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9kptw"] Oct 10 09:40:37 crc kubenswrapper[4732]: I1010 09:40:37.376647 4732 generic.go:334] "Generic (PLEG): container finished" podID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerID="c3698c1913aed908efc46a192154285d3c5d197fbc05b25d1a8428166c603b53" exitCode=0 Oct 10 
09:40:37 crc kubenswrapper[4732]: I1010 09:40:37.377055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kptw" event={"ID":"e8664d3f-7089-4ad9-8cef-6f08cff48999","Type":"ContainerDied","Data":"c3698c1913aed908efc46a192154285d3c5d197fbc05b25d1a8428166c603b53"} Oct 10 09:40:37 crc kubenswrapper[4732]: I1010 09:40:37.377097 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kptw" event={"ID":"e8664d3f-7089-4ad9-8cef-6f08cff48999","Type":"ContainerStarted","Data":"be2e5e921bb12a96c4b52b294fd4924315bbf1acbd54ba747769c36da16c0a09"} Oct 10 09:40:38 crc kubenswrapper[4732]: I1010 09:40:38.391108 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kptw" event={"ID":"e8664d3f-7089-4ad9-8cef-6f08cff48999","Type":"ContainerStarted","Data":"7efc2ca95fcc28b087682d86f9d751dae3caf465e24d248752f52ced7b6a54c2"} Oct 10 09:40:38 crc kubenswrapper[4732]: I1010 09:40:38.395400 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5fr7" event={"ID":"56179d94-92b8-4d29-ad44-3a09ffed2d09","Type":"ContainerStarted","Data":"401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba"} Oct 10 09:40:39 crc kubenswrapper[4732]: I1010 09:40:39.410972 4732 generic.go:334] "Generic (PLEG): container finished" podID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerID="401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba" exitCode=0 Oct 10 09:40:39 crc kubenswrapper[4732]: I1010 09:40:39.411093 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5fr7" event={"ID":"56179d94-92b8-4d29-ad44-3a09ffed2d09","Type":"ContainerDied","Data":"401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba"} Oct 10 09:40:39 crc kubenswrapper[4732]: I1010 09:40:39.415418 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerID="7efc2ca95fcc28b087682d86f9d751dae3caf465e24d248752f52ced7b6a54c2" exitCode=0 Oct 10 09:40:39 crc kubenswrapper[4732]: I1010 09:40:39.415487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kptw" event={"ID":"e8664d3f-7089-4ad9-8cef-6f08cff48999","Type":"ContainerDied","Data":"7efc2ca95fcc28b087682d86f9d751dae3caf465e24d248752f52ced7b6a54c2"} Oct 10 09:40:40 crc kubenswrapper[4732]: I1010 09:40:40.428705 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kptw" event={"ID":"e8664d3f-7089-4ad9-8cef-6f08cff48999","Type":"ContainerStarted","Data":"37544c013329ad84fb4e67dae819b9ed31475661a99de83b28a40dede9477c9c"} Oct 10 09:40:40 crc kubenswrapper[4732]: I1010 09:40:40.431499 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5fr7" event={"ID":"56179d94-92b8-4d29-ad44-3a09ffed2d09","Type":"ContainerStarted","Data":"b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8"} Oct 10 09:40:40 crc kubenswrapper[4732]: I1010 09:40:40.455644 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9kptw" podStartSLOduration=3.027250025 podStartE2EDuration="5.45562374s" podCreationTimestamp="2025-10-10 09:40:35 +0000 UTC" firstStartedPulling="2025-10-10 09:40:37.379294072 +0000 UTC m=+10164.448885343" lastFinishedPulling="2025-10-10 09:40:39.807667817 +0000 UTC m=+10166.877259058" observedRunningTime="2025-10-10 09:40:40.451056357 +0000 UTC m=+10167.520647608" watchObservedRunningTime="2025-10-10 09:40:40.45562374 +0000 UTC m=+10167.525214981" Oct 10 09:40:40 crc kubenswrapper[4732]: I1010 09:40:40.471518 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t5fr7" podStartSLOduration=3.021674054 podStartE2EDuration="6.47149891s" 
podCreationTimestamp="2025-10-10 09:40:34 +0000 UTC" firstStartedPulling="2025-10-10 09:40:36.367807811 +0000 UTC m=+10163.437399052" lastFinishedPulling="2025-10-10 09:40:39.817632627 +0000 UTC m=+10166.887223908" observedRunningTime="2025-10-10 09:40:40.469057804 +0000 UTC m=+10167.538649055" watchObservedRunningTime="2025-10-10 09:40:40.47149891 +0000 UTC m=+10167.541090161" Oct 10 09:40:45 crc kubenswrapper[4732]: I1010 09:40:45.143058 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:45 crc kubenswrapper[4732]: I1010 09:40:45.143786 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:45 crc kubenswrapper[4732]: I1010 09:40:45.198873 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:45 crc kubenswrapper[4732]: I1010 09:40:45.542152 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:45 crc kubenswrapper[4732]: I1010 09:40:45.589862 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5fr7"] Oct 10 09:40:46 crc kubenswrapper[4732]: I1010 09:40:46.066833 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:46 crc kubenswrapper[4732]: I1010 09:40:46.066894 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:46 crc kubenswrapper[4732]: I1010 09:40:46.123430 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:47 crc kubenswrapper[4732]: I1010 09:40:47.352726 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:47 crc kubenswrapper[4732]: I1010 09:40:47.494887 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t5fr7" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="registry-server" containerID="cri-o://b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8" gracePeriod=2 Oct 10 09:40:47 crc kubenswrapper[4732]: I1010 09:40:47.845281 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kptw"] Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.121035 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.198465 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f648g\" (UniqueName: \"kubernetes.io/projected/56179d94-92b8-4d29-ad44-3a09ffed2d09-kube-api-access-f648g\") pod \"56179d94-92b8-4d29-ad44-3a09ffed2d09\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.198643 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-catalog-content\") pod \"56179d94-92b8-4d29-ad44-3a09ffed2d09\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.198735 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-utilities\") pod \"56179d94-92b8-4d29-ad44-3a09ffed2d09\" (UID: \"56179d94-92b8-4d29-ad44-3a09ffed2d09\") " Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.199549 4732 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-utilities" (OuterVolumeSpecName: "utilities") pod "56179d94-92b8-4d29-ad44-3a09ffed2d09" (UID: "56179d94-92b8-4d29-ad44-3a09ffed2d09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.207353 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56179d94-92b8-4d29-ad44-3a09ffed2d09-kube-api-access-f648g" (OuterVolumeSpecName: "kube-api-access-f648g") pod "56179d94-92b8-4d29-ad44-3a09ffed2d09" (UID: "56179d94-92b8-4d29-ad44-3a09ffed2d09"). InnerVolumeSpecName "kube-api-access-f648g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.235117 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56179d94-92b8-4d29-ad44-3a09ffed2d09" (UID: "56179d94-92b8-4d29-ad44-3a09ffed2d09"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.300930 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f648g\" (UniqueName: \"kubernetes.io/projected/56179d94-92b8-4d29-ad44-3a09ffed2d09-kube-api-access-f648g\") on node \"crc\" DevicePath \"\"" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.300966 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.300977 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56179d94-92b8-4d29-ad44-3a09ffed2d09-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.506407 4732 generic.go:334] "Generic (PLEG): container finished" podID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerID="b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8" exitCode=0 Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.506484 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t5fr7" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.506477 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5fr7" event={"ID":"56179d94-92b8-4d29-ad44-3a09ffed2d09","Type":"ContainerDied","Data":"b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8"} Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.506562 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t5fr7" event={"ID":"56179d94-92b8-4d29-ad44-3a09ffed2d09","Type":"ContainerDied","Data":"3c610a18441d1ae9db025378a7f795444c1de0d64d78ff4f66cc06fb7ebc37b4"} Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.506593 4732 scope.go:117] "RemoveContainer" containerID="b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.506612 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9kptw" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="registry-server" containerID="cri-o://37544c013329ad84fb4e67dae819b9ed31475661a99de83b28a40dede9477c9c" gracePeriod=2 Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.532350 4732 scope.go:117] "RemoveContainer" containerID="401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba" Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.553508 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5fr7"] Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.566108 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t5fr7"] Oct 10 09:40:48 crc kubenswrapper[4732]: I1010 09:40:48.574622 4732 scope.go:117] "RemoveContainer" containerID="4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc" Oct 10 09:40:49 crc 
kubenswrapper[4732]: I1010 09:40:49.519994 4732 generic.go:334] "Generic (PLEG): container finished" podID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerID="37544c013329ad84fb4e67dae819b9ed31475661a99de83b28a40dede9477c9c" exitCode=0 Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.520222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kptw" event={"ID":"e8664d3f-7089-4ad9-8cef-6f08cff48999","Type":"ContainerDied","Data":"37544c013329ad84fb4e67dae819b9ed31475661a99de83b28a40dede9477c9c"} Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.535915 4732 scope.go:117] "RemoveContainer" containerID="b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8" Oct 10 09:40:49 crc kubenswrapper[4732]: E1010 09:40:49.536386 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8\": container with ID starting with b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8 not found: ID does not exist" containerID="b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8" Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.536432 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8"} err="failed to get container status \"b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8\": rpc error: code = NotFound desc = could not find container \"b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8\": container with ID starting with b2dd18e6cab86257d7d5e04347998175b606c7398e81384df830b3fc7b8ba4e8 not found: ID does not exist" Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.536637 4732 scope.go:117] "RemoveContainer" containerID="401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba" Oct 10 09:40:49 crc 
kubenswrapper[4732]: E1010 09:40:49.537086 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba\": container with ID starting with 401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba not found: ID does not exist" containerID="401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba" Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.537126 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba"} err="failed to get container status \"401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba\": rpc error: code = NotFound desc = could not find container \"401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba\": container with ID starting with 401d1f908b8f5e7c28e8b0695217f610a5e9a4d44bbf9e8e79d5ff2a13957aba not found: ID does not exist" Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.537150 4732 scope.go:117] "RemoveContainer" containerID="4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc" Oct 10 09:40:49 crc kubenswrapper[4732]: E1010 09:40:49.537558 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc\": container with ID starting with 4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc not found: ID does not exist" containerID="4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc" Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.537603 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc"} err="failed to get container status 
\"4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc\": rpc error: code = NotFound desc = could not find container \"4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc\": container with ID starting with 4a39e32c0e549351057707b41a60cf940e6d9b9467796bb5515f854c002398bc not found: ID does not exist" Oct 10 09:40:49 crc kubenswrapper[4732]: I1010 09:40:49.673202 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" path="/var/lib/kubelet/pods/56179d94-92b8-4d29-ad44-3a09ffed2d09/volumes" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.021949 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.140205 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-catalog-content\") pod \"e8664d3f-7089-4ad9-8cef-6f08cff48999\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.140645 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln9f7\" (UniqueName: \"kubernetes.io/projected/e8664d3f-7089-4ad9-8cef-6f08cff48999-kube-api-access-ln9f7\") pod \"e8664d3f-7089-4ad9-8cef-6f08cff48999\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.141112 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-utilities\") pod \"e8664d3f-7089-4ad9-8cef-6f08cff48999\" (UID: \"e8664d3f-7089-4ad9-8cef-6f08cff48999\") " Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.142014 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-utilities" (OuterVolumeSpecName: "utilities") pod "e8664d3f-7089-4ad9-8cef-6f08cff48999" (UID: "e8664d3f-7089-4ad9-8cef-6f08cff48999"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.145929 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8664d3f-7089-4ad9-8cef-6f08cff48999-kube-api-access-ln9f7" (OuterVolumeSpecName: "kube-api-access-ln9f7") pod "e8664d3f-7089-4ad9-8cef-6f08cff48999" (UID: "e8664d3f-7089-4ad9-8cef-6f08cff48999"). InnerVolumeSpecName "kube-api-access-ln9f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.146019 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.193836 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8664d3f-7089-4ad9-8cef-6f08cff48999" (UID: "e8664d3f-7089-4ad9-8cef-6f08cff48999"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.248759 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8664d3f-7089-4ad9-8cef-6f08cff48999-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.248793 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln9f7\" (UniqueName: \"kubernetes.io/projected/e8664d3f-7089-4ad9-8cef-6f08cff48999-kube-api-access-ln9f7\") on node \"crc\" DevicePath \"\"" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.532460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9kptw" event={"ID":"e8664d3f-7089-4ad9-8cef-6f08cff48999","Type":"ContainerDied","Data":"be2e5e921bb12a96c4b52b294fd4924315bbf1acbd54ba747769c36da16c0a09"} Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.532806 4732 scope.go:117] "RemoveContainer" containerID="37544c013329ad84fb4e67dae819b9ed31475661a99de83b28a40dede9477c9c" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.532549 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9kptw" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.569087 4732 scope.go:117] "RemoveContainer" containerID="7efc2ca95fcc28b087682d86f9d751dae3caf465e24d248752f52ced7b6a54c2" Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.575687 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9kptw"] Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.593261 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9kptw"] Oct 10 09:40:50 crc kubenswrapper[4732]: I1010 09:40:50.603679 4732 scope.go:117] "RemoveContainer" containerID="c3698c1913aed908efc46a192154285d3c5d197fbc05b25d1a8428166c603b53" Oct 10 09:40:51 crc kubenswrapper[4732]: I1010 09:40:51.678838 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" path="/var/lib/kubelet/pods/e8664d3f-7089-4ad9-8cef-6f08cff48999/volumes" Oct 10 09:41:25 crc kubenswrapper[4732]: I1010 09:41:25.395870 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:41:25 crc kubenswrapper[4732]: I1010 09:41:25.396804 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.463244 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xqblh"] Oct 10 09:41:37 crc kubenswrapper[4732]: E1010 
09:41:37.464313 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="registry-server" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464331 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="registry-server" Oct 10 09:41:37 crc kubenswrapper[4732]: E1010 09:41:37.464353 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="extract-utilities" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464361 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="extract-utilities" Oct 10 09:41:37 crc kubenswrapper[4732]: E1010 09:41:37.464380 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="extract-utilities" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464387 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="extract-utilities" Oct 10 09:41:37 crc kubenswrapper[4732]: E1010 09:41:37.464410 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="extract-content" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464416 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="extract-content" Oct 10 09:41:37 crc kubenswrapper[4732]: E1010 09:41:37.464433 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="registry-server" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464440 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="registry-server" Oct 10 09:41:37 crc kubenswrapper[4732]: E1010 
09:41:37.464456 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="extract-content" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464462 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="extract-content" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464686 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8664d3f-7089-4ad9-8cef-6f08cff48999" containerName="registry-server" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.464720 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="56179d94-92b8-4d29-ad44-3a09ffed2d09" containerName="registry-server" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.466611 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.478853 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqblh"] Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.583565 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-catalog-content\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.583932 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckzrl\" (UniqueName: \"kubernetes.io/projected/2fde275b-46b9-42f3-a638-a990ab0c2a3f-kube-api-access-ckzrl\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 
09:41:37.583999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-utilities\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.686067 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-utilities\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.686990 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-catalog-content\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.687383 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-utilities\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.687460 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckzrl\" (UniqueName: \"kubernetes.io/projected/2fde275b-46b9-42f3-a638-a990ab0c2a3f-kube-api-access-ckzrl\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.688019 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-catalog-content\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.706736 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckzrl\" (UniqueName: \"kubernetes.io/projected/2fde275b-46b9-42f3-a638-a990ab0c2a3f-kube-api-access-ckzrl\") pod \"redhat-operators-xqblh\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:37 crc kubenswrapper[4732]: I1010 09:41:37.805351 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:38 crc kubenswrapper[4732]: I1010 09:41:38.303136 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xqblh"] Oct 10 09:41:39 crc kubenswrapper[4732]: I1010 09:41:39.074084 4732 generic.go:334] "Generic (PLEG): container finished" podID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerID="e990e3993cff31d7f6f4c8e394e29201ca891cf61921a047e291b378002561af" exitCode=0 Oct 10 09:41:39 crc kubenswrapper[4732]: I1010 09:41:39.074158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqblh" event={"ID":"2fde275b-46b9-42f3-a638-a990ab0c2a3f","Type":"ContainerDied","Data":"e990e3993cff31d7f6f4c8e394e29201ca891cf61921a047e291b378002561af"} Oct 10 09:41:39 crc kubenswrapper[4732]: I1010 09:41:39.074353 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqblh" event={"ID":"2fde275b-46b9-42f3-a638-a990ab0c2a3f","Type":"ContainerStarted","Data":"faa0cc6e23a70794891709b9e7000a8f913032ee6671a11e50ed3e61fe74a9aa"} Oct 10 09:41:40 crc kubenswrapper[4732]: I1010 09:41:40.095842 
4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqblh" event={"ID":"2fde275b-46b9-42f3-a638-a990ab0c2a3f","Type":"ContainerStarted","Data":"290afe8695828ae7ee97364f4642210a07a355b43c304c3675deb2b4c71dd74c"} Oct 10 09:41:43 crc kubenswrapper[4732]: I1010 09:41:43.131467 4732 generic.go:334] "Generic (PLEG): container finished" podID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerID="290afe8695828ae7ee97364f4642210a07a355b43c304c3675deb2b4c71dd74c" exitCode=0 Oct 10 09:41:43 crc kubenswrapper[4732]: I1010 09:41:43.131575 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqblh" event={"ID":"2fde275b-46b9-42f3-a638-a990ab0c2a3f","Type":"ContainerDied","Data":"290afe8695828ae7ee97364f4642210a07a355b43c304c3675deb2b4c71dd74c"} Oct 10 09:41:44 crc kubenswrapper[4732]: I1010 09:41:44.143671 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqblh" event={"ID":"2fde275b-46b9-42f3-a638-a990ab0c2a3f","Type":"ContainerStarted","Data":"1c995dbff91464c03c1211a8e5633a044ae7f7eaf1ffa03495b1803328ce6725"} Oct 10 09:41:44 crc kubenswrapper[4732]: I1010 09:41:44.162422 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xqblh" podStartSLOduration=2.667533004 podStartE2EDuration="7.162400711s" podCreationTimestamp="2025-10-10 09:41:37 +0000 UTC" firstStartedPulling="2025-10-10 09:41:39.076232506 +0000 UTC m=+10226.145823757" lastFinishedPulling="2025-10-10 09:41:43.571100223 +0000 UTC m=+10230.640691464" observedRunningTime="2025-10-10 09:41:44.159256625 +0000 UTC m=+10231.228847886" watchObservedRunningTime="2025-10-10 09:41:44.162400711 +0000 UTC m=+10231.231991952" Oct 10 09:41:47 crc kubenswrapper[4732]: I1010 09:41:47.806507 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:47 crc 
kubenswrapper[4732]: I1010 09:41:47.808460 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:49 crc kubenswrapper[4732]: I1010 09:41:49.260763 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xqblh" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="registry-server" probeResult="failure" output=< Oct 10 09:41:49 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s Oct 10 09:41:49 crc kubenswrapper[4732]: > Oct 10 09:41:55 crc kubenswrapper[4732]: I1010 09:41:55.356382 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:41:55 crc kubenswrapper[4732]: I1010 09:41:55.357177 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:41:57 crc kubenswrapper[4732]: I1010 09:41:57.880759 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:41:57 crc kubenswrapper[4732]: I1010 09:41:57.951757 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:42:00 crc kubenswrapper[4732]: I1010 09:42:00.632999 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqblh"] Oct 10 09:42:00 crc kubenswrapper[4732]: I1010 09:42:00.634345 4732 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-xqblh" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="registry-server" containerID="cri-o://1c995dbff91464c03c1211a8e5633a044ae7f7eaf1ffa03495b1803328ce6725" gracePeriod=2 Oct 10 09:42:01 crc kubenswrapper[4732]: I1010 09:42:01.329666 4732 generic.go:334] "Generic (PLEG): container finished" podID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerID="1c995dbff91464c03c1211a8e5633a044ae7f7eaf1ffa03495b1803328ce6725" exitCode=0 Oct 10 09:42:01 crc kubenswrapper[4732]: I1010 09:42:01.329746 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqblh" event={"ID":"2fde275b-46b9-42f3-a638-a990ab0c2a3f","Type":"ContainerDied","Data":"1c995dbff91464c03c1211a8e5633a044ae7f7eaf1ffa03495b1803328ce6725"} Oct 10 09:42:01 crc kubenswrapper[4732]: I1010 09:42:01.994917 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.112981 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-catalog-content\") pod \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.113127 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-utilities\") pod \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.113192 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckzrl\" (UniqueName: \"kubernetes.io/projected/2fde275b-46b9-42f3-a638-a990ab0c2a3f-kube-api-access-ckzrl\") pod 
\"2fde275b-46b9-42f3-a638-a990ab0c2a3f\" (UID: \"2fde275b-46b9-42f3-a638-a990ab0c2a3f\") " Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.114059 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-utilities" (OuterVolumeSpecName: "utilities") pod "2fde275b-46b9-42f3-a638-a990ab0c2a3f" (UID: "2fde275b-46b9-42f3-a638-a990ab0c2a3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.126540 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fde275b-46b9-42f3-a638-a990ab0c2a3f-kube-api-access-ckzrl" (OuterVolumeSpecName: "kube-api-access-ckzrl") pod "2fde275b-46b9-42f3-a638-a990ab0c2a3f" (UID: "2fde275b-46b9-42f3-a638-a990ab0c2a3f"). InnerVolumeSpecName "kube-api-access-ckzrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.199774 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fde275b-46b9-42f3-a638-a990ab0c2a3f" (UID: "2fde275b-46b9-42f3-a638-a990ab0c2a3f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.214846 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.215159 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckzrl\" (UniqueName: \"kubernetes.io/projected/2fde275b-46b9-42f3-a638-a990ab0c2a3f-kube-api-access-ckzrl\") on node \"crc\" DevicePath \"\"" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.215171 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fde275b-46b9-42f3-a638-a990ab0c2a3f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.344272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xqblh" event={"ID":"2fde275b-46b9-42f3-a638-a990ab0c2a3f","Type":"ContainerDied","Data":"faa0cc6e23a70794891709b9e7000a8f913032ee6671a11e50ed3e61fe74a9aa"} Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.344345 4732 scope.go:117] "RemoveContainer" containerID="1c995dbff91464c03c1211a8e5633a044ae7f7eaf1ffa03495b1803328ce6725" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.344406 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xqblh" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.374880 4732 scope.go:117] "RemoveContainer" containerID="290afe8695828ae7ee97364f4642210a07a355b43c304c3675deb2b4c71dd74c" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.398492 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xqblh"] Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.403228 4732 scope.go:117] "RemoveContainer" containerID="e990e3993cff31d7f6f4c8e394e29201ca891cf61921a047e291b378002561af" Oct 10 09:42:02 crc kubenswrapper[4732]: I1010 09:42:02.413665 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xqblh"] Oct 10 09:42:03 crc kubenswrapper[4732]: I1010 09:42:03.675092 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" path="/var/lib/kubelet/pods/2fde275b-46b9-42f3-a638-a990ab0c2a3f/volumes" Oct 10 09:42:25 crc kubenswrapper[4732]: I1010 09:42:25.356547 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:42:25 crc kubenswrapper[4732]: I1010 09:42:25.358199 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:42:25 crc kubenswrapper[4732]: I1010 09:42:25.358309 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 09:42:25 crc 
kubenswrapper[4732]: I1010 09:42:25.359602 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b398c30a8dc125d1670d9a1305c808485ecb86a3d8e24829aac9c1710bfd803b"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:42:25 crc kubenswrapper[4732]: I1010 09:42:25.359740 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://b398c30a8dc125d1670d9a1305c808485ecb86a3d8e24829aac9c1710bfd803b" gracePeriod=600 Oct 10 09:42:26 crc kubenswrapper[4732]: I1010 09:42:26.606985 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="b398c30a8dc125d1670d9a1305c808485ecb86a3d8e24829aac9c1710bfd803b" exitCode=0 Oct 10 09:42:26 crc kubenswrapper[4732]: I1010 09:42:26.607060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"b398c30a8dc125d1670d9a1305c808485ecb86a3d8e24829aac9c1710bfd803b"} Oct 10 09:42:26 crc kubenswrapper[4732]: I1010 09:42:26.607405 4732 scope.go:117] "RemoveContainer" containerID="5120c3df2eeab4098628aa0aff3bcf2b75d99b1b0507c74bddd6eecf43dd82bd" Oct 10 09:42:27 crc kubenswrapper[4732]: I1010 09:42:27.617139 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b"} Oct 10 09:43:06 crc kubenswrapper[4732]: I1010 09:43:06.035362 4732 generic.go:334] 
"Generic (PLEG): container finished" podID="16c8a157-203f-48fc-a30d-bd652be267f5" containerID="38cdcca4b50306991716f988e4d6d45b4116850c7a5af4185206b3bbac3def11" exitCode=0 Oct 10 09:43:06 crc kubenswrapper[4732]: I1010 09:43:06.035504 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16c8a157-203f-48fc-a30d-bd652be267f5","Type":"ContainerDied","Data":"38cdcca4b50306991716f988e4d6d45b4116850c7a5af4185206b3bbac3def11"} Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.511344 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.654714 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.654773 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-workdir\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.654835 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config-secret\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.654884 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.654933 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xrdj\" (UniqueName: \"kubernetes.io/projected/16c8a157-203f-48fc-a30d-bd652be267f5-kube-api-access-2xrdj\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.655073 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ssh-key\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.655107 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-temporary\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.655135 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-config-data\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.655200 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ca-certs\") pod \"16c8a157-203f-48fc-a30d-bd652be267f5\" (UID: \"16c8a157-203f-48fc-a30d-bd652be267f5\") " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.657828 4732 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.660158 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-config-data" (OuterVolumeSpecName: "config-data") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.661505 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.666518 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c8a157-203f-48fc-a30d-bd652be267f5-kube-api-access-2xrdj" (OuterVolumeSpecName: "kube-api-access-2xrdj") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "kube-api-access-2xrdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.675339 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.686858 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.687466 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.689004 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.710178 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "16c8a157-203f-48fc-a30d-bd652be267f5" (UID: "16c8a157-203f-48fc-a30d-bd652be267f5"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758391 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xrdj\" (UniqueName: \"kubernetes.io/projected/16c8a157-203f-48fc-a30d-bd652be267f5-kube-api-access-2xrdj\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758419 4732 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758428 4732 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758438 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758448 4732 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758457 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758479 4732 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/16c8a157-203f-48fc-a30d-bd652be267f5-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758488 4732 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/16c8a157-203f-48fc-a30d-bd652be267f5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.758511 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.780245 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 10 09:43:07 crc kubenswrapper[4732]: I1010 09:43:07.860116 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 10 09:43:08 crc kubenswrapper[4732]: I1010 09:43:08.069088 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"16c8a157-203f-48fc-a30d-bd652be267f5","Type":"ContainerDied","Data":"2ad594bdf3dd4c086ddf916a19e9855f8315a7130690dbd9d1359141329751f8"} Oct 10 09:43:08 crc kubenswrapper[4732]: I1010 09:43:08.069142 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ad594bdf3dd4c086ddf916a19e9855f8315a7130690dbd9d1359141329751f8" Oct 10 09:43:08 crc 
kubenswrapper[4732]: I1010 09:43:08.069223 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.546269 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 10 09:43:11 crc kubenswrapper[4732]: E1010 09:43:11.547192 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c8a157-203f-48fc-a30d-bd652be267f5" containerName="tempest-tests-tempest-tests-runner" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.547205 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c8a157-203f-48fc-a30d-bd652be267f5" containerName="tempest-tests-tempest-tests-runner" Oct 10 09:43:11 crc kubenswrapper[4732]: E1010 09:43:11.547222 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="extract-utilities" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.547230 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="extract-utilities" Oct 10 09:43:11 crc kubenswrapper[4732]: E1010 09:43:11.547243 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="extract-content" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.547249 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="extract-content" Oct 10 09:43:11 crc kubenswrapper[4732]: E1010 09:43:11.547282 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="registry-server" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.547288 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="registry-server" Oct 10 09:43:11 crc 
kubenswrapper[4732]: I1010 09:43:11.547492 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fde275b-46b9-42f3-a638-a990ab0c2a3f" containerName="registry-server" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.547523 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c8a157-203f-48fc-a30d-bd652be267f5" containerName="tempest-tests-tempest-tests-runner" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.548219 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.550861 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-5ddsv" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.561185 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.643146 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj8sf\" (UniqueName: \"kubernetes.io/projected/be27b5f6-29b1-45f9-8140-c9a2da177198-kube-api-access-jj8sf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"be27b5f6-29b1-45f9-8140-c9a2da177198\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.643580 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"be27b5f6-29b1-45f9-8140-c9a2da177198\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.745317 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"be27b5f6-29b1-45f9-8140-c9a2da177198\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.745496 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj8sf\" (UniqueName: \"kubernetes.io/projected/be27b5f6-29b1-45f9-8140-c9a2da177198-kube-api-access-jj8sf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"be27b5f6-29b1-45f9-8140-c9a2da177198\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.745802 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"be27b5f6-29b1-45f9-8140-c9a2da177198\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.763474 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj8sf\" (UniqueName: \"kubernetes.io/projected/be27b5f6-29b1-45f9-8140-c9a2da177198-kube-api-access-jj8sf\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"be27b5f6-29b1-45f9-8140-c9a2da177198\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.774935 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"be27b5f6-29b1-45f9-8140-c9a2da177198\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:11 crc kubenswrapper[4732]: I1010 09:43:11.874297 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 10 09:43:12 crc kubenswrapper[4732]: I1010 09:43:12.370198 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 10 09:43:13 crc kubenswrapper[4732]: W1010 09:43:13.316810 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe27b5f6_29b1_45f9_8140_c9a2da177198.slice/crio-8d4914804c07a6a0b46ecaf7bd9e4f9375503c6f6b042e49fef1b76d68976a4b WatchSource:0}: Error finding container 8d4914804c07a6a0b46ecaf7bd9e4f9375503c6f6b042e49fef1b76d68976a4b: Status 404 returned error can't find the container with id 8d4914804c07a6a0b46ecaf7bd9e4f9375503c6f6b042e49fef1b76d68976a4b Oct 10 09:43:13 crc kubenswrapper[4732]: I1010 09:43:13.324304 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:43:14 crc kubenswrapper[4732]: I1010 09:43:14.135965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"be27b5f6-29b1-45f9-8140-c9a2da177198","Type":"ContainerStarted","Data":"8d4914804c07a6a0b46ecaf7bd9e4f9375503c6f6b042e49fef1b76d68976a4b"} Oct 10 09:43:16 crc kubenswrapper[4732]: I1010 09:43:16.166253 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"be27b5f6-29b1-45f9-8140-c9a2da177198","Type":"ContainerStarted","Data":"ffc571531a33cc3e32d1a8dece8cdacaeaa91348673b79608d0273eb57b65bd3"} Oct 10 09:43:16 crc kubenswrapper[4732]: I1010 09:43:16.186284 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=3.599753495 podStartE2EDuration="5.186263695s" podCreationTimestamp="2025-10-10 09:43:11 +0000 UTC" firstStartedPulling="2025-10-10 09:43:13.324124818 +0000 UTC m=+10320.393716059" lastFinishedPulling="2025-10-10 09:43:14.910634998 +0000 UTC m=+10321.980226259" observedRunningTime="2025-10-10 09:43:16.184783275 +0000 UTC m=+10323.254374556" watchObservedRunningTime="2025-10-10 09:43:16.186263695 +0000 UTC m=+10323.255854936" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.050389 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s2kwv/must-gather-95b45"] Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.053339 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.064653 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-s2kwv"/"kube-root-ca.crt" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.064702 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-s2kwv"/"openshift-service-ca.crt" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.064946 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-s2kwv"/"default-dockercfg-c2mrj" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.070356 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s2kwv/must-gather-95b45"] Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.155893 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10e4f98d-cb01-497a-838a-8308d49241e6-must-gather-output\") pod \"must-gather-95b45\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " 
pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.155943 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psqpb\" (UniqueName: \"kubernetes.io/projected/10e4f98d-cb01-497a-838a-8308d49241e6-kube-api-access-psqpb\") pod \"must-gather-95b45\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.257539 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10e4f98d-cb01-497a-838a-8308d49241e6-must-gather-output\") pod \"must-gather-95b45\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.257586 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psqpb\" (UniqueName: \"kubernetes.io/projected/10e4f98d-cb01-497a-838a-8308d49241e6-kube-api-access-psqpb\") pod \"must-gather-95b45\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.257949 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10e4f98d-cb01-497a-838a-8308d49241e6-must-gather-output\") pod \"must-gather-95b45\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.278598 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psqpb\" (UniqueName: \"kubernetes.io/projected/10e4f98d-cb01-497a-838a-8308d49241e6-kube-api-access-psqpb\") pod \"must-gather-95b45\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " 
pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.379288 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:44:25 crc kubenswrapper[4732]: I1010 09:44:25.867225 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s2kwv/must-gather-95b45"] Oct 10 09:44:26 crc kubenswrapper[4732]: I1010 09:44:26.004498 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/must-gather-95b45" event={"ID":"10e4f98d-cb01-497a-838a-8308d49241e6","Type":"ContainerStarted","Data":"6e29303df399622a616b6aba73aa9b58d2e3a904217f7523579c578e71704e1b"} Oct 10 09:44:32 crc kubenswrapper[4732]: I1010 09:44:32.087161 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/must-gather-95b45" event={"ID":"10e4f98d-cb01-497a-838a-8308d49241e6","Type":"ContainerStarted","Data":"418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45"} Oct 10 09:44:32 crc kubenswrapper[4732]: I1010 09:44:32.087781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/must-gather-95b45" event={"ID":"10e4f98d-cb01-497a-838a-8308d49241e6","Type":"ContainerStarted","Data":"3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639"} Oct 10 09:44:32 crc kubenswrapper[4732]: I1010 09:44:32.127968 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s2kwv/must-gather-95b45" podStartSLOduration=2.762542658 podStartE2EDuration="7.127939788s" podCreationTimestamp="2025-10-10 09:44:25 +0000 UTC" firstStartedPulling="2025-10-10 09:44:25.880980057 +0000 UTC m=+10392.950571308" lastFinishedPulling="2025-10-10 09:44:30.246377197 +0000 UTC m=+10397.315968438" observedRunningTime="2025-10-10 09:44:32.107021452 +0000 UTC m=+10399.176612733" watchObservedRunningTime="2025-10-10 09:44:32.127939788 +0000 UTC 
m=+10399.197531059" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.071712 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-ljg4d"] Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.073424 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.186718 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/897e5fb7-feac-40a2-8c63-3b366b462a3c-host\") pod \"crc-debug-ljg4d\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.187144 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqmlg\" (UniqueName: \"kubernetes.io/projected/897e5fb7-feac-40a2-8c63-3b366b462a3c-kube-api-access-wqmlg\") pod \"crc-debug-ljg4d\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.289064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqmlg\" (UniqueName: \"kubernetes.io/projected/897e5fb7-feac-40a2-8c63-3b366b462a3c-kube-api-access-wqmlg\") pod \"crc-debug-ljg4d\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.289211 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/897e5fb7-feac-40a2-8c63-3b366b462a3c-host\") pod \"crc-debug-ljg4d\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.289439 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/897e5fb7-feac-40a2-8c63-3b366b462a3c-host\") pod \"crc-debug-ljg4d\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.316929 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqmlg\" (UniqueName: \"kubernetes.io/projected/897e5fb7-feac-40a2-8c63-3b366b462a3c-kube-api-access-wqmlg\") pod \"crc-debug-ljg4d\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:36 crc kubenswrapper[4732]: I1010 09:44:36.401948 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:44:37 crc kubenswrapper[4732]: I1010 09:44:37.135529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" event={"ID":"897e5fb7-feac-40a2-8c63-3b366b462a3c","Type":"ContainerStarted","Data":"621b4c3d44683f79511d0d0c26d1354587e95593b90ed83b82d0c9c78f149f52"} Oct 10 09:44:49 crc kubenswrapper[4732]: I1010 09:44:49.243841 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" event={"ID":"897e5fb7-feac-40a2-8c63-3b366b462a3c","Type":"ContainerStarted","Data":"a3f7f5e32e74e42098b9828a9a936f0553e8fb20162c4e651bbd372518594c72"} Oct 10 09:44:49 crc kubenswrapper[4732]: I1010 09:44:49.264677 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" podStartSLOduration=2.178378317 podStartE2EDuration="13.264659296s" podCreationTimestamp="2025-10-10 09:44:36 +0000 UTC" firstStartedPulling="2025-10-10 09:44:36.44358251 +0000 UTC m=+10403.513173761" lastFinishedPulling="2025-10-10 09:44:47.529863459 +0000 UTC m=+10414.599454740" 
observedRunningTime="2025-10-10 09:44:49.258336664 +0000 UTC m=+10416.327927915" watchObservedRunningTime="2025-10-10 09:44:49.264659296 +0000 UTC m=+10416.334250547" Oct 10 09:44:55 crc kubenswrapper[4732]: I1010 09:44:55.355876 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:44:55 crc kubenswrapper[4732]: I1010 09:44:55.356509 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.170466 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x"] Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.174085 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.176193 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.178190 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.183161 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x"] Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.203022 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df609cc8-9816-42bf-b035-f96cda8e0498-config-volume\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.203105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hc7\" (UniqueName: \"kubernetes.io/projected/df609cc8-9816-42bf-b035-f96cda8e0498-kube-api-access-m2hc7\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.203330 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df609cc8-9816-42bf-b035-f96cda8e0498-secret-volume\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.305341 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df609cc8-9816-42bf-b035-f96cda8e0498-config-volume\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.305452 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hc7\" (UniqueName: \"kubernetes.io/projected/df609cc8-9816-42bf-b035-f96cda8e0498-kube-api-access-m2hc7\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.305637 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df609cc8-9816-42bf-b035-f96cda8e0498-secret-volume\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.306443 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df609cc8-9816-42bf-b035-f96cda8e0498-config-volume\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.310905 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/df609cc8-9816-42bf-b035-f96cda8e0498-secret-volume\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:00 crc kubenswrapper[4732]: I1010 09:45:00.324505 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hc7\" (UniqueName: \"kubernetes.io/projected/df609cc8-9816-42bf-b035-f96cda8e0498-kube-api-access-m2hc7\") pod \"collect-profiles-29334825-kf96x\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:02 crc kubenswrapper[4732]: I1010 09:45:02.104473 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:02 crc kubenswrapper[4732]: I1010 09:45:02.666706 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x"] Oct 10 09:45:03 crc kubenswrapper[4732]: I1010 09:45:03.423324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" event={"ID":"df609cc8-9816-42bf-b035-f96cda8e0498","Type":"ContainerStarted","Data":"73fe5459635be2d163da67c1f2af55223235987baff8939e443838d6b015d6ed"} Oct 10 09:45:03 crc kubenswrapper[4732]: I1010 09:45:03.423713 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" event={"ID":"df609cc8-9816-42bf-b035-f96cda8e0498","Type":"ContainerStarted","Data":"898214aa06c40228ae7c35dd780b85b2a286dd7abc51bc403480e6d9fe6e91ca"} Oct 10 09:45:03 crc kubenswrapper[4732]: I1010 09:45:03.448683 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" 
podStartSLOduration=3.448656573 podStartE2EDuration="3.448656573s" podCreationTimestamp="2025-10-10 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:45:03.445280352 +0000 UTC m=+10430.514871613" watchObservedRunningTime="2025-10-10 09:45:03.448656573 +0000 UTC m=+10430.518247824" Oct 10 09:45:04 crc kubenswrapper[4732]: I1010 09:45:04.435202 4732 generic.go:334] "Generic (PLEG): container finished" podID="df609cc8-9816-42bf-b035-f96cda8e0498" containerID="73fe5459635be2d163da67c1f2af55223235987baff8939e443838d6b015d6ed" exitCode=0 Oct 10 09:45:04 crc kubenswrapper[4732]: I1010 09:45:04.435274 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" event={"ID":"df609cc8-9816-42bf-b035-f96cda8e0498","Type":"ContainerDied","Data":"73fe5459635be2d163da67c1f2af55223235987baff8939e443838d6b015d6ed"} Oct 10 09:45:05 crc kubenswrapper[4732]: I1010 09:45:05.817981 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:05 crc kubenswrapper[4732]: I1010 09:45:05.924407 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df609cc8-9816-42bf-b035-f96cda8e0498-secret-volume\") pod \"df609cc8-9816-42bf-b035-f96cda8e0498\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " Oct 10 09:45:05 crc kubenswrapper[4732]: I1010 09:45:05.924722 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2hc7\" (UniqueName: \"kubernetes.io/projected/df609cc8-9816-42bf-b035-f96cda8e0498-kube-api-access-m2hc7\") pod \"df609cc8-9816-42bf-b035-f96cda8e0498\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " Oct 10 09:45:05 crc kubenswrapper[4732]: I1010 09:45:05.924855 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df609cc8-9816-42bf-b035-f96cda8e0498-config-volume\") pod \"df609cc8-9816-42bf-b035-f96cda8e0498\" (UID: \"df609cc8-9816-42bf-b035-f96cda8e0498\") " Oct 10 09:45:05 crc kubenswrapper[4732]: I1010 09:45:05.925550 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df609cc8-9816-42bf-b035-f96cda8e0498-config-volume" (OuterVolumeSpecName: "config-volume") pod "df609cc8-9816-42bf-b035-f96cda8e0498" (UID: "df609cc8-9816-42bf-b035-f96cda8e0498"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.028092 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df609cc8-9816-42bf-b035-f96cda8e0498-config-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.463181 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" event={"ID":"df609cc8-9816-42bf-b035-f96cda8e0498","Type":"ContainerDied","Data":"898214aa06c40228ae7c35dd780b85b2a286dd7abc51bc403480e6d9fe6e91ca"} Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.463221 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="898214aa06c40228ae7c35dd780b85b2a286dd7abc51bc403480e6d9fe6e91ca" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.463337 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29334825-kf96x" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.509384 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df609cc8-9816-42bf-b035-f96cda8e0498-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "df609cc8-9816-42bf-b035-f96cda8e0498" (UID: "df609cc8-9816-42bf-b035-f96cda8e0498"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.510020 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df609cc8-9816-42bf-b035-f96cda8e0498-kube-api-access-m2hc7" (OuterVolumeSpecName: "kube-api-access-m2hc7") pod "df609cc8-9816-42bf-b035-f96cda8e0498" (UID: "df609cc8-9816-42bf-b035-f96cda8e0498"). InnerVolumeSpecName "kube-api-access-m2hc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.538486 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2hc7\" (UniqueName: \"kubernetes.io/projected/df609cc8-9816-42bf-b035-f96cda8e0498-kube-api-access-m2hc7\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.538524 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df609cc8-9816-42bf-b035-f96cda8e0498-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.929141 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt"] Oct 10 09:45:06 crc kubenswrapper[4732]: I1010 09:45:06.940548 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29334780-rtgxt"] Oct 10 09:45:07 crc kubenswrapper[4732]: I1010 09:45:07.672968 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="324b63ee-f5a8-4cde-863f-450ffae67192" path="/var/lib/kubelet/pods/324b63ee-f5a8-4cde-863f-450ffae67192/volumes" Oct 10 09:45:25 crc kubenswrapper[4732]: I1010 09:45:25.355795 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:45:25 crc kubenswrapper[4732]: I1010 09:45:25.356345 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 
09:45:34 crc kubenswrapper[4732]: I1010 09:45:34.783931 4732 generic.go:334] "Generic (PLEG): container finished" podID="897e5fb7-feac-40a2-8c63-3b366b462a3c" containerID="a3f7f5e32e74e42098b9828a9a936f0553e8fb20162c4e651bbd372518594c72" exitCode=0 Oct 10 09:45:34 crc kubenswrapper[4732]: I1010 09:45:34.784039 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" event={"ID":"897e5fb7-feac-40a2-8c63-3b366b462a3c","Type":"ContainerDied","Data":"a3f7f5e32e74e42098b9828a9a936f0553e8fb20162c4e651bbd372518594c72"} Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.908306 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.917483 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqmlg\" (UniqueName: \"kubernetes.io/projected/897e5fb7-feac-40a2-8c63-3b366b462a3c-kube-api-access-wqmlg\") pod \"897e5fb7-feac-40a2-8c63-3b366b462a3c\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.918157 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/897e5fb7-feac-40a2-8c63-3b366b462a3c-host\") pod \"897e5fb7-feac-40a2-8c63-3b366b462a3c\" (UID: \"897e5fb7-feac-40a2-8c63-3b366b462a3c\") " Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.918665 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/897e5fb7-feac-40a2-8c63-3b366b462a3c-host" (OuterVolumeSpecName: "host") pod "897e5fb7-feac-40a2-8c63-3b366b462a3c" (UID: "897e5fb7-feac-40a2-8c63-3b366b462a3c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.919171 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/897e5fb7-feac-40a2-8c63-3b366b462a3c-host\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.929771 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/897e5fb7-feac-40a2-8c63-3b366b462a3c-kube-api-access-wqmlg" (OuterVolumeSpecName: "kube-api-access-wqmlg") pod "897e5fb7-feac-40a2-8c63-3b366b462a3c" (UID: "897e5fb7-feac-40a2-8c63-3b366b462a3c"). InnerVolumeSpecName "kube-api-access-wqmlg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.955317 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-ljg4d"] Oct 10 09:45:35 crc kubenswrapper[4732]: I1010 09:45:35.971108 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-ljg4d"] Oct 10 09:45:36 crc kubenswrapper[4732]: I1010 09:45:36.020738 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqmlg\" (UniqueName: \"kubernetes.io/projected/897e5fb7-feac-40a2-8c63-3b366b462a3c-kube-api-access-wqmlg\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:36 crc kubenswrapper[4732]: I1010 09:45:36.824589 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="621b4c3d44683f79511d0d0c26d1354587e95593b90ed83b82d0c9c78f149f52" Oct 10 09:45:36 crc kubenswrapper[4732]: I1010 09:45:36.824662 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-ljg4d" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.109292 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-r2rjk"] Oct 10 09:45:37 crc kubenswrapper[4732]: E1010 09:45:37.109797 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df609cc8-9816-42bf-b035-f96cda8e0498" containerName="collect-profiles" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.109819 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="df609cc8-9816-42bf-b035-f96cda8e0498" containerName="collect-profiles" Oct 10 09:45:37 crc kubenswrapper[4732]: E1010 09:45:37.109869 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="897e5fb7-feac-40a2-8c63-3b366b462a3c" containerName="container-00" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.109880 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="897e5fb7-feac-40a2-8c63-3b366b462a3c" containerName="container-00" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.110171 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="897e5fb7-feac-40a2-8c63-3b366b462a3c" containerName="container-00" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.110197 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="df609cc8-9816-42bf-b035-f96cda8e0498" containerName="collect-profiles" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.111061 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.139175 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a86b9e0e-0476-4be2-b205-29858eeadf72-host\") pod \"crc-debug-r2rjk\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.139335 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nk6f\" (UniqueName: \"kubernetes.io/projected/a86b9e0e-0476-4be2-b205-29858eeadf72-kube-api-access-9nk6f\") pod \"crc-debug-r2rjk\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.240946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a86b9e0e-0476-4be2-b205-29858eeadf72-host\") pod \"crc-debug-r2rjk\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.241098 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a86b9e0e-0476-4be2-b205-29858eeadf72-host\") pod \"crc-debug-r2rjk\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.241108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nk6f\" (UniqueName: \"kubernetes.io/projected/a86b9e0e-0476-4be2-b205-29858eeadf72-kube-api-access-9nk6f\") pod \"crc-debug-r2rjk\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc 
kubenswrapper[4732]: I1010 09:45:37.262855 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nk6f\" (UniqueName: \"kubernetes.io/projected/a86b9e0e-0476-4be2-b205-29858eeadf72-kube-api-access-9nk6f\") pod \"crc-debug-r2rjk\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.427514 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.673390 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="897e5fb7-feac-40a2-8c63-3b366b462a3c" path="/var/lib/kubelet/pods/897e5fb7-feac-40a2-8c63-3b366b462a3c/volumes" Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.837123 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" event={"ID":"a86b9e0e-0476-4be2-b205-29858eeadf72","Type":"ContainerStarted","Data":"153a960bcc689acf6d2a6cd3039a0a7e0f208cff6b7a74ece489d33d94080680"} Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.837219 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" event={"ID":"a86b9e0e-0476-4be2-b205-29858eeadf72","Type":"ContainerStarted","Data":"e641f0ad68583993d64b1c635758cb4563fdfb4b873c7d66846295acf415f267"} Oct 10 09:45:37 crc kubenswrapper[4732]: I1010 09:45:37.862715 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" podStartSLOduration=0.862660175 podStartE2EDuration="862.660175ms" podCreationTimestamp="2025-10-10 09:45:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-10 09:45:37.850759603 +0000 UTC m=+10464.920350854" watchObservedRunningTime="2025-10-10 09:45:37.862660175 +0000 
UTC m=+10464.932251426" Oct 10 09:45:38 crc kubenswrapper[4732]: I1010 09:45:38.847872 4732 generic.go:334] "Generic (PLEG): container finished" podID="a86b9e0e-0476-4be2-b205-29858eeadf72" containerID="153a960bcc689acf6d2a6cd3039a0a7e0f208cff6b7a74ece489d33d94080680" exitCode=0 Oct 10 09:45:38 crc kubenswrapper[4732]: I1010 09:45:38.847936 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" event={"ID":"a86b9e0e-0476-4be2-b205-29858eeadf72","Type":"ContainerDied","Data":"153a960bcc689acf6d2a6cd3039a0a7e0f208cff6b7a74ece489d33d94080680"} Oct 10 09:45:39 crc kubenswrapper[4732]: I1010 09:45:39.711400 4732 scope.go:117] "RemoveContainer" containerID="b4683358975482982cf85e245aa1a4e3853cfa0ac3fa093cf226fb5881dd46c7" Oct 10 09:45:39 crc kubenswrapper[4732]: I1010 09:45:39.957454 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.002825 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nk6f\" (UniqueName: \"kubernetes.io/projected/a86b9e0e-0476-4be2-b205-29858eeadf72-kube-api-access-9nk6f\") pod \"a86b9e0e-0476-4be2-b205-29858eeadf72\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.002977 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a86b9e0e-0476-4be2-b205-29858eeadf72-host\") pod \"a86b9e0e-0476-4be2-b205-29858eeadf72\" (UID: \"a86b9e0e-0476-4be2-b205-29858eeadf72\") " Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.003652 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a86b9e0e-0476-4be2-b205-29858eeadf72-host" (OuterVolumeSpecName: "host") pod "a86b9e0e-0476-4be2-b205-29858eeadf72" (UID: 
"a86b9e0e-0476-4be2-b205-29858eeadf72"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.020055 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86b9e0e-0476-4be2-b205-29858eeadf72-kube-api-access-9nk6f" (OuterVolumeSpecName: "kube-api-access-9nk6f") pod "a86b9e0e-0476-4be2-b205-29858eeadf72" (UID: "a86b9e0e-0476-4be2-b205-29858eeadf72"). InnerVolumeSpecName "kube-api-access-9nk6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.105095 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nk6f\" (UniqueName: \"kubernetes.io/projected/a86b9e0e-0476-4be2-b205-29858eeadf72-kube-api-access-9nk6f\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.105131 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a86b9e0e-0476-4be2-b205-29858eeadf72-host\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.627920 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-r2rjk"] Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.640992 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-r2rjk"] Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.869231 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e641f0ad68583993d64b1c635758cb4563fdfb4b873c7d66846295acf415f267" Oct 10 09:45:40 crc kubenswrapper[4732]: I1010 09:45:40.869296 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-r2rjk" Oct 10 09:45:41 crc kubenswrapper[4732]: I1010 09:45:41.672251 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86b9e0e-0476-4be2-b205-29858eeadf72" path="/var/lib/kubelet/pods/a86b9e0e-0476-4be2-b205-29858eeadf72/volumes" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.390368 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-25m4n"] Oct 10 09:45:42 crc kubenswrapper[4732]: E1010 09:45:42.390844 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86b9e0e-0476-4be2-b205-29858eeadf72" containerName="container-00" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.390863 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86b9e0e-0476-4be2-b205-29858eeadf72" containerName="container-00" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.391127 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86b9e0e-0476-4be2-b205-29858eeadf72" containerName="container-00" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.391940 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.451493 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba3c27c-e134-4744-92c2-0f195812eebd-host\") pod \"crc-debug-25m4n\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.451724 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxx9l\" (UniqueName: \"kubernetes.io/projected/aba3c27c-e134-4744-92c2-0f195812eebd-kube-api-access-pxx9l\") pod \"crc-debug-25m4n\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.553179 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxx9l\" (UniqueName: \"kubernetes.io/projected/aba3c27c-e134-4744-92c2-0f195812eebd-kube-api-access-pxx9l\") pod \"crc-debug-25m4n\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.553244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba3c27c-e134-4744-92c2-0f195812eebd-host\") pod \"crc-debug-25m4n\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.553434 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba3c27c-e134-4744-92c2-0f195812eebd-host\") pod \"crc-debug-25m4n\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc 
kubenswrapper[4732]: I1010 09:45:42.570079 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxx9l\" (UniqueName: \"kubernetes.io/projected/aba3c27c-e134-4744-92c2-0f195812eebd-kube-api-access-pxx9l\") pod \"crc-debug-25m4n\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.713515 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:42 crc kubenswrapper[4732]: I1010 09:45:42.887291 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-25m4n" event={"ID":"aba3c27c-e134-4744-92c2-0f195812eebd","Type":"ContainerStarted","Data":"01c0106c350829174f201da2fc8851b4d39e56105483ff7d47c90503b485f63e"} Oct 10 09:45:43 crc kubenswrapper[4732]: I1010 09:45:43.902797 4732 generic.go:334] "Generic (PLEG): container finished" podID="aba3c27c-e134-4744-92c2-0f195812eebd" containerID="2f2e74272281c6e0ba423ce3ec7e124162f0d83593bd518fcb93503b9063c08f" exitCode=0 Oct 10 09:45:43 crc kubenswrapper[4732]: I1010 09:45:43.902873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/crc-debug-25m4n" event={"ID":"aba3c27c-e134-4744-92c2-0f195812eebd","Type":"ContainerDied","Data":"2f2e74272281c6e0ba423ce3ec7e124162f0d83593bd518fcb93503b9063c08f"} Oct 10 09:45:43 crc kubenswrapper[4732]: I1010 09:45:43.957709 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-25m4n"] Oct 10 09:45:43 crc kubenswrapper[4732]: I1010 09:45:43.973287 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s2kwv/crc-debug-25m4n"] Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.014920 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.103207 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxx9l\" (UniqueName: \"kubernetes.io/projected/aba3c27c-e134-4744-92c2-0f195812eebd-kube-api-access-pxx9l\") pod \"aba3c27c-e134-4744-92c2-0f195812eebd\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.103543 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba3c27c-e134-4744-92c2-0f195812eebd-host\") pod \"aba3c27c-e134-4744-92c2-0f195812eebd\" (UID: \"aba3c27c-e134-4744-92c2-0f195812eebd\") " Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.104127 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aba3c27c-e134-4744-92c2-0f195812eebd-host" (OuterVolumeSpecName: "host") pod "aba3c27c-e134-4744-92c2-0f195812eebd" (UID: "aba3c27c-e134-4744-92c2-0f195812eebd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.110198 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba3c27c-e134-4744-92c2-0f195812eebd-kube-api-access-pxx9l" (OuterVolumeSpecName: "kube-api-access-pxx9l") pod "aba3c27c-e134-4744-92c2-0f195812eebd" (UID: "aba3c27c-e134-4744-92c2-0f195812eebd"). InnerVolumeSpecName "kube-api-access-pxx9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.206262 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxx9l\" (UniqueName: \"kubernetes.io/projected/aba3c27c-e134-4744-92c2-0f195812eebd-kube-api-access-pxx9l\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.206304 4732 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aba3c27c-e134-4744-92c2-0f195812eebd-host\") on node \"crc\" DevicePath \"\"" Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.673659 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba3c27c-e134-4744-92c2-0f195812eebd" path="/var/lib/kubelet/pods/aba3c27c-e134-4744-92c2-0f195812eebd/volumes" Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.919442 4732 scope.go:117] "RemoveContainer" containerID="2f2e74272281c6e0ba423ce3ec7e124162f0d83593bd518fcb93503b9063c08f" Oct 10 09:45:45 crc kubenswrapper[4732]: I1010 09:45:45.919488 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s2kwv/crc-debug-25m4n" Oct 10 09:45:55 crc kubenswrapper[4732]: I1010 09:45:55.361885 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 10 09:45:55 crc kubenswrapper[4732]: I1010 09:45:55.362535 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 10 09:45:55 crc kubenswrapper[4732]: I1010 09:45:55.362599 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-292kd" Oct 10 09:45:55 crc kubenswrapper[4732]: I1010 09:45:55.363518 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b"} pod="openshift-machine-config-operator/machine-config-daemon-292kd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 10 09:45:55 crc kubenswrapper[4732]: I1010 09:45:55.363587 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" containerID="cri-o://a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" gracePeriod=600 Oct 10 09:45:55 crc kubenswrapper[4732]: E1010 09:45:55.493029 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:45:56 crc kubenswrapper[4732]: I1010 09:45:56.067634 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" exitCode=0 Oct 10 09:45:56 crc kubenswrapper[4732]: I1010 09:45:56.067755 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerDied","Data":"a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b"} Oct 10 09:45:56 crc kubenswrapper[4732]: I1010 09:45:56.068142 4732 scope.go:117] "RemoveContainer" containerID="b398c30a8dc125d1670d9a1305c808485ecb86a3d8e24829aac9c1710bfd803b" Oct 10 09:45:56 crc kubenswrapper[4732]: I1010 09:45:56.069551 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:45:56 crc kubenswrapper[4732]: E1010 09:45:56.070628 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:45:58 crc kubenswrapper[4732]: I1010 09:45:58.530620 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_18c40f62-0e3e-413b-ba46-ba26ea267b7f/init-config-reloader/0.log" Oct 10 09:45:58 crc kubenswrapper[4732]: I1010 09:45:58.687957 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_18c40f62-0e3e-413b-ba46-ba26ea267b7f/init-config-reloader/0.log" Oct 10 09:45:58 crc kubenswrapper[4732]: I1010 09:45:58.724435 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_18c40f62-0e3e-413b-ba46-ba26ea267b7f/alertmanager/0.log" Oct 10 09:45:58 crc kubenswrapper[4732]: I1010 09:45:58.869375 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_18c40f62-0e3e-413b-ba46-ba26ea267b7f/config-reloader/0.log" Oct 10 09:45:58 crc kubenswrapper[4732]: I1010 09:45:58.909939 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f9b41169-6ef6-4089-a8d2-b528da4862e9/aodh-api/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.086346 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f9b41169-6ef6-4089-a8d2-b528da4862e9/aodh-evaluator/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.115379 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f9b41169-6ef6-4089-a8d2-b528da4862e9/aodh-listener/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.150200 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_f9b41169-6ef6-4089-a8d2-b528da4862e9/aodh-notifier/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.312718 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5cc7b7c678-fwj45_e0ebdd0f-4d64-4973-bd61-1982ae84e68f/barbican-api/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.330263 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-5cc7b7c678-fwj45_e0ebdd0f-4d64-4973-bd61-1982ae84e68f/barbican-api-log/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.622341 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67c968cc46-jcgx7_7af9c216-845f-4a2a-b87c-28efa0bb0b8e/barbican-keystone-listener/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.890878 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6765975-zgmjf_a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0/barbican-worker/0.log" Oct 10 09:45:59 crc kubenswrapper[4732]: I1010 09:45:59.896845 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67c968cc46-jcgx7_7af9c216-845f-4a2a-b87c-28efa0bb0b8e/barbican-keystone-listener-log/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.020209 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6765975-zgmjf_a3c08da0-8c8b-4d8a-893d-b3d77a2acdc0/barbican-worker-log/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.107505 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-qw2mn_48a3c76e-8709-4c95-b087-4a5d83083c97/bootstrap-openstack-openstack-cell1/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.326834 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_76531d1c-d162-4a95-9764-be79581cd832/ceilometer-central-agent/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.371541 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_76531d1c-d162-4a95-9764-be79581cd832/ceilometer-notification-agent/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.516060 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_76531d1c-d162-4a95-9764-be79581cd832/proxy-httpd/0.log" Oct 
10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.527555 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_76531d1c-d162-4a95-9764-be79581cd832/sg-core/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.750036 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_917ad3d8-97f9-4994-9f75-a6f9307137c9/cinder-api-log/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.765547 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_917ad3d8-97f9-4994-9f75-a6f9307137c9/cinder-api/0.log" Oct 10 09:46:00 crc kubenswrapper[4732]: I1010 09:46:00.932296 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3993a84-cfe6-47e0-9f72-5d56aa71cdba/cinder-scheduler/0.log" Oct 10 09:46:01 crc kubenswrapper[4732]: I1010 09:46:01.002011 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3993a84-cfe6-47e0-9f72-5d56aa71cdba/probe/0.log" Oct 10 09:46:01 crc kubenswrapper[4732]: I1010 09:46:01.139134 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-845gd_f58d836d-542b-46f8-8446-2ab8874fb834/configure-network-openstack-openstack-cell1/0.log" Oct 10 09:46:01 crc kubenswrapper[4732]: I1010 09:46:01.296340 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-ksh2z_1690368c-3a87-48f6-9b22-ca371199b4dd/configure-os-openstack-openstack-cell1/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.106592 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8ff6dbf6c-6kbs5_000c5dfc-e5f4-49de-858d-d8a01e3acebc/init/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.264779 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-8ff6dbf6c-6kbs5_000c5dfc-e5f4-49de-858d-d8a01e3acebc/init/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.334608 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8ff6dbf6c-6kbs5_000c5dfc-e5f4-49de-858d-d8a01e3acebc/dnsmasq-dns/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.344094 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-26bh4_6a758950-7ac7-433f-8d14-1a39efae029b/download-cache-openstack-openstack-cell1/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.530987 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2799c38e-8a3c-47ca-95c8-c401ce8f5c54/glance-log/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.561387 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2799c38e-8a3c-47ca-95c8-c401ce8f5c54/glance-httpd/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.733658 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f566fa71-b966-4a19-8843-9d2530fe37a2/glance-httpd/0.log" Oct 10 09:46:02 crc kubenswrapper[4732]: I1010 09:46:02.734122 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f566fa71-b966-4a19-8843-9d2530fe37a2/glance-log/0.log" Oct 10 09:46:03 crc kubenswrapper[4732]: I1010 09:46:03.333591 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5844bd7ddb-sds75_96365079-5a8a-4b2b-87f0-502a7b09ed3c/heat-engine/0.log" Oct 10 09:46:03 crc kubenswrapper[4732]: I1010 09:46:03.613983 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-575d45d5d7-xbm4f_ad059a3e-2246-4d06-bc00-e030379d69d5/heat-api/0.log" Oct 10 09:46:03 crc kubenswrapper[4732]: I1010 09:46:03.653083 4732 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7dbb656958-mt7c9_3ba0bd8d-c5b5-4ac5-8abf-260d7f7dc652/heat-cfnapi/0.log" Oct 10 09:46:03 crc kubenswrapper[4732]: I1010 09:46:03.945429 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-swd76_9f873c30-bbd6-4e94-9cff-9c998ab92b9c/install-certs-openstack-openstack-cell1/0.log" Oct 10 09:46:04 crc kubenswrapper[4732]: I1010 09:46:04.039623 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-567cbfd676-jtwjz_19d08ea8-b473-4840-9663-9f74ed2cf748/horizon/0.log" Oct 10 09:46:04 crc kubenswrapper[4732]: I1010 09:46:04.250209 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-thp4h_4a2e5b09-abde-4b6e-925b-b2b2ff1a1346/install-os-openstack-openstack-cell1/0.log" Oct 10 09:46:04 crc kubenswrapper[4732]: I1010 09:46:04.521219 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29334781-sb8rl_c4a4459f-b36d-49c9-9444-e1481c7d1087/keystone-cron/0.log" Oct 10 09:46:04 crc kubenswrapper[4732]: I1010 09:46:04.602621 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-567cbfd676-jtwjz_19d08ea8-b473-4840-9663-9f74ed2cf748/horizon-log/0.log" Oct 10 09:46:04 crc kubenswrapper[4732]: I1010 09:46:04.700499 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4c4e511b-2d1b-4771-a029-65d752c5728b/kube-state-metrics/0.log" Oct 10 09:46:04 crc kubenswrapper[4732]: I1010 09:46:04.904293 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-vqcwk_e0f22220-e678-461f-b4bf-fd1c0415a490/libvirt-openstack-openstack-cell1/0.log" Oct 10 09:46:05 crc kubenswrapper[4732]: I1010 09:46:05.345629 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-777778bb9f-g6hc7_8b09380c-07fd-4b37-93bd-c8c44f496ae4/keystone-api/0.log" Oct 10 09:46:05 crc kubenswrapper[4732]: I1010 09:46:05.595925 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c9989b5f-7clkk_aa03b38d-f0b0-4556-b71d-1abc28d2eb82/neutron-httpd/0.log" Oct 10 09:46:05 crc kubenswrapper[4732]: I1010 09:46:05.790409 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-57c9989b5f-7clkk_aa03b38d-f0b0-4556-b71d-1abc28d2eb82/neutron-api/0.log" Oct 10 09:46:05 crc kubenswrapper[4732]: I1010 09:46:05.849986 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-l5xlf_6dd18d28-5db6-4fff-96a8-320afc9e7638/neutron-dhcp-openstack-openstack-cell1/0.log" Oct 10 09:46:06 crc kubenswrapper[4732]: I1010 09:46:06.063674 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-jf2wj_015e3b36-7470-428f-8e34-37a633345b2e/neutron-metadata-openstack-openstack-cell1/0.log" Oct 10 09:46:06 crc kubenswrapper[4732]: I1010 09:46:06.167714 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-g7zhx_9a0f923f-84b2-44b1-9030-636b08eff952/neutron-sriov-openstack-openstack-cell1/0.log" Oct 10 09:46:06 crc kubenswrapper[4732]: I1010 09:46:06.647740 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_84dd5777-e881-497c-845b-97a5d608989c/nova-api-log/0.log" Oct 10 09:46:06 crc kubenswrapper[4732]: I1010 09:46:06.912738 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3643c9df-0a5f-4b91-a164-a61498f3725c/nova-cell0-conductor-conductor/0.log" Oct 10 09:46:06 crc kubenswrapper[4732]: I1010 09:46:06.933261 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_84dd5777-e881-497c-845b-97a5d608989c/nova-api-api/0.log" Oct 10 09:46:07 crc kubenswrapper[4732]: I1010 09:46:07.136305 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4b06e190-6280-4b02-9712-214229f1c30f/nova-cell1-conductor-conductor/0.log" Oct 10 09:46:07 crc kubenswrapper[4732]: I1010 09:46:07.310053 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_63c5250c-8cab-4d8c-a3c5-e36be4ec6528/nova-cell1-novncproxy-novncproxy/0.log" Oct 10 09:46:07 crc kubenswrapper[4732]: I1010 09:46:07.669343 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellm79zn_b0c399d3-48d7-4316-931f-2115e341ce3d/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Oct 10 09:46:07 crc kubenswrapper[4732]: I1010 09:46:07.800784 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-7j7kl_dfd6910b-38c8-4a2c-9095-16c82b02e3e0/nova-cell1-openstack-openstack-cell1/0.log" Oct 10 09:46:07 crc kubenswrapper[4732]: I1010 09:46:07.919570 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff6afe7c-aa67-4af3-9de1-f1046d7ea386/nova-metadata-log/0.log" Oct 10 09:46:08 crc kubenswrapper[4732]: I1010 09:46:08.307245 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c9e2f176-f337-4dda-af59-d839d8985489/nova-scheduler-scheduler/0.log" Oct 10 09:46:08 crc kubenswrapper[4732]: I1010 09:46:08.515548 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05bbbd5f-b1ea-4a6d-9787-082d27fbfae6/mysql-bootstrap/0.log" Oct 10 09:46:08 crc kubenswrapper[4732]: I1010 09:46:08.670143 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_05bbbd5f-b1ea-4a6d-9787-082d27fbfae6/mysql-bootstrap/0.log" Oct 10 09:46:08 crc kubenswrapper[4732]: I1010 09:46:08.703931 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_05bbbd5f-b1ea-4a6d-9787-082d27fbfae6/galera/0.log" Oct 10 09:46:08 crc kubenswrapper[4732]: I1010 09:46:08.838229 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ff6afe7c-aa67-4af3-9de1-f1046d7ea386/nova-metadata-metadata/0.log" Oct 10 09:46:08 crc kubenswrapper[4732]: I1010 09:46:08.901371 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bf925f45-1e6a-41dc-bc76-c554a0a21636/mysql-bootstrap/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.079076 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bf925f45-1e6a-41dc-bc76-c554a0a21636/mysql-bootstrap/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.125478 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_bf925f45-1e6a-41dc-bc76-c554a0a21636/galera/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.275555 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c449ab79-27fb-47ce-8e4c-fc160420cddf/openstackclient/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.412145 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99/openstack-network-exporter/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.501482 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5cb6cb9a-2b53-47db-a421-c0a9f8e0ea99/ovn-northd/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.660627 4732 scope.go:117] "RemoveContainer" 
containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:46:09 crc kubenswrapper[4732]: E1010 09:46:09.660894 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.711497 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-sczt2_8d6feded-06a9-476b-8ce0-9856c8ac5de2/ovn-openstack-openstack-cell1/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.845977 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55afdb4c-6296-4aed-881f-69bc7cfa7f2b/openstack-network-exporter/0.log" Oct 10 09:46:09 crc kubenswrapper[4732]: I1010 09:46:09.896265 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55afdb4c-6296-4aed-881f-69bc7cfa7f2b/ovsdbserver-nb/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.065209 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4/openstack-network-exporter/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.116943 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_67f7fa9a-9c8b-4d6f-92ed-ee314be0fdc4/ovsdbserver-nb/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.297766 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_defba928-0686-4eb5-b82f-a3d81310408c/openstack-network-exporter/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.379841 4732 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_defba928-0686-4eb5-b82f-a3d81310408c/ovsdbserver-nb/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.466854 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_855f8bf2-6c81-4281-9363-180138a2aea0/openstack-network-exporter/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.571128 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_855f8bf2-6c81-4281-9363-180138a2aea0/ovsdbserver-sb/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.703483 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_c9473b7d-16bd-43aa-b01b-c977c828b6ad/openstack-network-exporter/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.780460 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_c9473b7d-16bd-43aa-b01b-c977c828b6ad/ovsdbserver-sb/0.log" Oct 10 09:46:10 crc kubenswrapper[4732]: I1010 09:46:10.918798 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_2755dd3e-b135-4dcc-8016-7f1034232bb9/openstack-network-exporter/0.log" Oct 10 09:46:11 crc kubenswrapper[4732]: I1010 09:46:11.011916 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_2755dd3e-b135-4dcc-8016-7f1034232bb9/ovsdbserver-sb/0.log" Oct 10 09:46:11 crc kubenswrapper[4732]: I1010 09:46:11.383044 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-697b9444d8-dst4f_cba9debd-ecde-489e-af5e-bbd2b4d0321f/placement-log/0.log" Oct 10 09:46:11 crc kubenswrapper[4732]: I1010 09:46:11.396274 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-697b9444d8-dst4f_cba9debd-ecde-489e-af5e-bbd2b4d0321f/placement-api/0.log" Oct 10 09:46:12 crc kubenswrapper[4732]: I1010 09:46:12.167421 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c6g8s8_56daaa72-0707-450e-946f-649e04f9a0bc/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 10 09:46:12 crc kubenswrapper[4732]: I1010 09:46:12.300280 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe/init-config-reloader/0.log" Oct 10 09:46:12 crc kubenswrapper[4732]: I1010 09:46:12.490471 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe/prometheus/0.log" Oct 10 09:46:12 crc kubenswrapper[4732]: I1010 09:46:12.503401 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe/config-reloader/0.log" Oct 10 09:46:12 crc kubenswrapper[4732]: I1010 09:46:12.525481 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe/init-config-reloader/0.log" Oct 10 09:46:12 crc kubenswrapper[4732]: I1010 09:46:12.664222 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0d5ec3e6-9faf-4457-a0ef-8050e9c8cabe/thanos-sidecar/0.log" Oct 10 09:46:12 crc kubenswrapper[4732]: I1010 09:46:12.761108 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f822368-67da-407d-9a7e-def860134a98/setup-container/0.log" Oct 10 09:46:13 crc kubenswrapper[4732]: I1010 09:46:13.001939 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f822368-67da-407d-9a7e-def860134a98/setup-container/0.log" Oct 10 09:46:13 crc kubenswrapper[4732]: I1010 09:46:13.004115 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_3f822368-67da-407d-9a7e-def860134a98/rabbitmq/0.log" Oct 10 
09:46:13 crc kubenswrapper[4732]: I1010 09:46:13.836563 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9167b3d7-e83f-4e83-8dfe-a1daa954ea9f/setup-container/0.log" Oct 10 09:46:13 crc kubenswrapper[4732]: I1010 09:46:13.969075 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9167b3d7-e83f-4e83-8dfe-a1daa954ea9f/setup-container/0.log" Oct 10 09:46:14 crc kubenswrapper[4732]: I1010 09:46:14.079029 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_9167b3d7-e83f-4e83-8dfe-a1daa954ea9f/rabbitmq/0.log" Oct 10 09:46:14 crc kubenswrapper[4732]: I1010 09:46:14.281555 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-rg9rf_93fa28a0-a777-44b4-8649-fda723a616d7/reboot-os-openstack-openstack-cell1/0.log" Oct 10 09:46:14 crc kubenswrapper[4732]: I1010 09:46:14.375111 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-gcl2g_b6b83b95-f247-43e6-a1c3-3d3a2cb6cb5b/run-os-openstack-openstack-cell1/0.log" Oct 10 09:46:14 crc kubenswrapper[4732]: I1010 09:46:14.604626 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-scp5h_5d62c0cf-4132-4b71-af85-ca430cab6a8f/ssh-known-hosts-openstack/0.log" Oct 10 09:46:14 crc kubenswrapper[4732]: I1010 09:46:14.840636 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5764484767-24vlx_a653e702-8013-4a30-b236-a5496d1b29e8/proxy-server/0.log" Oct 10 09:46:15 crc kubenswrapper[4732]: I1010 09:46:15.013054 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5764484767-24vlx_a653e702-8013-4a30-b236-a5496d1b29e8/proxy-httpd/0.log" Oct 10 09:46:15 crc kubenswrapper[4732]: I1010 09:46:15.064201 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-ring-rebalance-swk7w_1d08ea64-6030-4467-964d-f85c284a1a1b/swift-ring-rebalance/0.log" Oct 10 09:46:15 crc kubenswrapper[4732]: I1010 09:46:15.462245 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-b7wxt_ac31623c-6f14-4647-b50e-22c1a6e37741/telemetry-openstack-openstack-cell1/0.log" Oct 10 09:46:15 crc kubenswrapper[4732]: I1010 09:46:15.663274 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_16c8a157-203f-48fc-a30d-bd652be267f5/tempest-tests-tempest-tests-runner/0.log" Oct 10 09:46:15 crc kubenswrapper[4732]: I1010 09:46:15.721769 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_be27b5f6-29b1-45f9-8140-c9a2da177198/test-operator-logs-container/0.log" Oct 10 09:46:15 crc kubenswrapper[4732]: I1010 09:46:15.913145 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-29jxq_2aeb77fe-3b78-467f-a4c1-c383d1ad4d19/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Oct 10 09:46:16 crc kubenswrapper[4732]: I1010 09:46:16.081189 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-8qfmp_deae8388-32f8-45a9-959e-a0b6e758d873/validate-network-openstack-openstack-cell1/0.log" Oct 10 09:46:21 crc kubenswrapper[4732]: I1010 09:46:21.660196 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:46:21 crc kubenswrapper[4732]: E1010 09:46:21.660870 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:46:30 crc kubenswrapper[4732]: I1010 09:46:30.383379 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5044a9c9-6ab0-449a-817a-904207e1dba9/memcached/0.log" Oct 10 09:46:33 crc kubenswrapper[4732]: I1010 09:46:33.666919 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:46:33 crc kubenswrapper[4732]: E1010 09:46:33.667613 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:46:45 crc kubenswrapper[4732]: I1010 09:46:45.660184 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:46:45 crc kubenswrapper[4732]: E1010 09:46:45.662137 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:47:00 crc kubenswrapper[4732]: I1010 09:47:00.661119 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:47:00 crc kubenswrapper[4732]: E1010 09:47:00.663001 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:47:11 crc kubenswrapper[4732]: I1010 09:47:11.660843 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:47:11 crc kubenswrapper[4732]: E1010 09:47:11.662090 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:47:21 crc kubenswrapper[4732]: I1010 09:47:21.810672 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-b54wh_ebf07c0d-be2b-41ec-8363-bdfcc2d3802a/kube-rbac-proxy/0.log" Oct 10 09:47:21 crc kubenswrapper[4732]: I1010 09:47:21.945458 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-658bdf4b74-b54wh_ebf07c0d-be2b-41ec-8363-bdfcc2d3802a/manager/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.049432 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-xt4c5_eb7804d7-814a-4aeb-b9d5-b359cada4441/kube-rbac-proxy/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.195512 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7b7fb68549-xt4c5_eb7804d7-814a-4aeb-b9d5-b359cada4441/manager/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.201887 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-pbxl8_579b1489-9552-485c-92da-5386e7b2afeb/kube-rbac-proxy/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.244938 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-85d5d9dd78-pbxl8_579b1489-9552-485c-92da-5386e7b2afeb/manager/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.374920 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg_bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc/util/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.532232 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg_bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc/pull/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.550556 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg_bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc/util/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.551321 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg_bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc/pull/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.661157 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:47:22 crc kubenswrapper[4732]: E1010 09:47:22.661508 4732 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.735947 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg_bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc/extract/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.758255 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg_bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc/pull/0.log" Oct 10 09:47:22 crc kubenswrapper[4732]: I1010 09:47:22.758425 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f60551a1fb7e79e9788cd8502fdb7f98c74e281ccbf8bf8c740740bbablt8cg_bf5a9a96-f8cf-4fd0-a8e2-d29db90bfcdc/util/0.log" Oct 10 09:47:23 crc kubenswrapper[4732]: I1010 09:47:23.654181 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-9zhd2_455baf3e-9434-4962-93bb-cd6497747fa5/kube-rbac-proxy/0.log" Oct 10 09:47:23 crc kubenswrapper[4732]: I1010 09:47:23.669367 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-gvtth_ac7e4be6-eb30-4fec-bb28-8f7181d7d337/kube-rbac-proxy/0.log" Oct 10 09:47:23 crc kubenswrapper[4732]: I1010 09:47:23.791287 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84b9b84486-9zhd2_455baf3e-9434-4962-93bb-cd6497747fa5/manager/0.log" Oct 10 09:47:23 crc kubenswrapper[4732]: I1010 09:47:23.912771 4732 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-s8qhg_edbb64bd-a7b0-40ea-90e2-7cc1fee46f76/kube-rbac-proxy/0.log" Oct 10 09:47:23 crc kubenswrapper[4732]: I1010 09:47:23.923339 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-858f76bbdd-gvtth_ac7e4be6-eb30-4fec-bb28-8f7181d7d337/manager/0.log" Oct 10 09:47:23 crc kubenswrapper[4732]: I1010 09:47:23.992379 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7ffbcb7588-s8qhg_edbb64bd-a7b0-40ea-90e2-7cc1fee46f76/manager/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.073664 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-nztbg_0da772cc-f90b-4ee7-8793-2fd804249c91/kube-rbac-proxy/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.179076 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-dpw4c_33631d8c-63c6-4912-be80-748b6c997cae/kube-rbac-proxy/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.300456 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-9c5c78d49-dpw4c_33631d8c-63c6-4912-be80-748b6c997cae/manager/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.324208 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-zz2vz_6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b/kube-rbac-proxy/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.379594 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-656bcbd775-nztbg_0da772cc-f90b-4ee7-8793-2fd804249c91/manager/0.log" Oct 10 09:47:24 crc 
kubenswrapper[4732]: I1010 09:47:24.497303 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55b6b7c7b8-zz2vz_6a8b2f0b-d16d-4a9a-a3a1-dfc2547a061b/manager/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.552926 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-2knl5_de2d002c-ff31-4c5b-aaa6-9e19c00caf6c/manager/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.586582 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5f67fbc655-2knl5_de2d002c-ff31-4c5b-aaa6-9e19c00caf6c/kube-rbac-proxy/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.665382 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-42jm6_0f893cf0-2c81-455d-a447-b0745e767b18/kube-rbac-proxy/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.721458 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f9fb45f8f-42jm6_0f893cf0-2c81-455d-a447-b0745e767b18/manager/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.759733 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-4hzzt_7ed8853a-5e8a-4dce-abb2-73bc7375a2bb/kube-rbac-proxy/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.828577 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-79d585cb66-4hzzt_7ed8853a-5e8a-4dce-abb2-73bc7375a2bb/manager/0.log" Oct 10 09:47:24 crc kubenswrapper[4732]: I1010 09:47:24.937882 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-s5qjg_48a56683-0762-4720-9640-c2b4e9ffb277/kube-rbac-proxy/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.046577 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5df598886f-s5qjg_48a56683-0762-4720-9640-c2b4e9ffb277/manager/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.055011 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-mx78z_9c145bb1-292a-4675-be8c-9bd49d4034f2/kube-rbac-proxy/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.087530 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69fdcfc5f5-mx78z_9c145bb1-292a-4675-be8c-9bd49d4034f2/manager/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.198584 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84c868ff4cfg79v_d99c70c9-9474-4418-8030-df6d871283e7/kube-rbac-proxy/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.217991 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84c868ff4cfg79v_d99c70c9-9474-4418-8030-df6d871283e7/manager/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.268879 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5698bb9464-8qpcv_6f554597-7d00-422a-b570-834795047cf9/kube-rbac-proxy/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.599777 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-599bffcb5d-5sws5_7fd96c0c-f327-4080-9429-28e5a10b932a/kube-rbac-proxy/0.log" Oct 10 09:47:25 crc 
kubenswrapper[4732]: I1010 09:47:25.818654 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-599bffcb5d-5sws5_7fd96c0c-f327-4080-9429-28e5a10b932a/operator/0.log" Oct 10 09:47:25 crc kubenswrapper[4732]: I1010 09:47:25.996459 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-sl8ht_38e49d40-f9b0-476b-a875-891fdb26d8fc/kube-rbac-proxy/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.053318 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-p8tlw_5ed64387-dd36-41ee-82e9-779579474c87/registry-server/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.126037 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-79df5fb58c-sl8ht_38e49d40-f9b0-476b-a875-891fdb26d8fc/manager/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.167575 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-cbf4m_20e2b6da-45d9-40d7-8a93-05cc865543c6/kube-rbac-proxy/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.327811 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-68b6c87b68-cbf4m_20e2b6da-45d9-40d7-8a93-05cc865543c6/manager/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.384812 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-bm5xf_26800fd4-b33e-4bb8-815b-5ec03fc9b22b/operator/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.567746 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-8zx8p_af04a067-8a30-4d2d-a0ff-b3206375d952/manager/0.log" Oct 
10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.578965 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-db6d7f97b-8zx8p_af04a067-8a30-4d2d-a0ff-b3206375d952/kube-rbac-proxy/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.653615 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-mhkk2_a0ce9c8b-219c-40da-bc5b-b171446c36ba/kube-rbac-proxy/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.879861 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-j8cfc_9744ec37-6d1a-4b31-b443-26ef804824f3/manager/0.log" Oct 10 09:47:26 crc kubenswrapper[4732]: I1010 09:47:26.905091 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5458f77c4-j8cfc_9744ec37-6d1a-4b31-b443-26ef804824f3/kube-rbac-proxy/0.log" Oct 10 09:47:27 crc kubenswrapper[4732]: I1010 09:47:27.106390 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-26jhx_6913b400-809d-4aa7-b478-999c34cdf0da/kube-rbac-proxy/0.log" Oct 10 09:47:27 crc kubenswrapper[4732]: I1010 09:47:27.135704 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-7f554bff7b-26jhx_6913b400-809d-4aa7-b478-999c34cdf0da/manager/0.log" Oct 10 09:47:27 crc kubenswrapper[4732]: I1010 09:47:27.214895 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-67cfc6749b-mhkk2_a0ce9c8b-219c-40da-bc5b-b171446c36ba/manager/0.log" Oct 10 09:47:27 crc kubenswrapper[4732]: I1010 09:47:27.624652 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5698bb9464-8qpcv_6f554597-7d00-422a-b570-834795047cf9/manager/0.log" Oct 10 09:47:37 crc kubenswrapper[4732]: I1010 09:47:37.664168 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:47:37 crc kubenswrapper[4732]: E1010 09:47:37.664996 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:47:42 crc kubenswrapper[4732]: I1010 09:47:42.806834 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wv8n4_dec63d28-86dc-4410-87a5-b7837f0d7070/control-plane-machine-set-operator/0.log" Oct 10 09:47:43 crc kubenswrapper[4732]: I1010 09:47:43.016891 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z2jfv_3e64e809-d579-480a-bfed-24473604cff0/kube-rbac-proxy/0.log" Oct 10 09:47:43 crc kubenswrapper[4732]: I1010 09:47:43.058910 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-z2jfv_3e64e809-d579-480a-bfed-24473604cff0/machine-api-operator/0.log" Oct 10 09:47:51 crc kubenswrapper[4732]: I1010 09:47:51.660231 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:47:51 crc kubenswrapper[4732]: E1010 09:47:51.661205 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:47:56 crc kubenswrapper[4732]: I1010 09:47:56.863704 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-7rj9k_dd7307fa-d381-41e0-b69f-aa09aeacad83/cert-manager-controller/0.log" Oct 10 09:47:57 crc kubenswrapper[4732]: I1010 09:47:57.240645 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-b6fhc_825885af-0634-426b-92b4-e877ae53c058/cert-manager-cainjector/0.log" Oct 10 09:47:57 crc kubenswrapper[4732]: I1010 09:47:57.314953 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-c7c5k_3db91b6b-e402-459a-b068-700c99ca4552/cert-manager-webhook/0.log" Oct 10 09:48:05 crc kubenswrapper[4732]: I1010 09:48:05.662526 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:48:05 crc kubenswrapper[4732]: E1010 09:48:05.663369 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:48:10 crc kubenswrapper[4732]: I1010 09:48:10.114523 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-lcv95_bc0013a6-b60e-46d4-b9dd-bd692cc42068/nmstate-console-plugin/0.log" Oct 10 09:48:10 crc kubenswrapper[4732]: I1010 09:48:10.359956 4732 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hnbqn_060ec441-2901-4e80-bd22-0b9ece859320/nmstate-handler/0.log" Oct 10 09:48:10 crc kubenswrapper[4732]: I1010 09:48:10.383805 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kd86h_7a16ea47-04cf-4b90-8380-35ad716c299c/kube-rbac-proxy/0.log" Oct 10 09:48:10 crc kubenswrapper[4732]: I1010 09:48:10.406856 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-kd86h_7a16ea47-04cf-4b90-8380-35ad716c299c/nmstate-metrics/0.log" Oct 10 09:48:10 crc kubenswrapper[4732]: I1010 09:48:10.566156 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-sz7pd_e1c7f94b-267b-420a-bd99-7d34c8b02a22/nmstate-operator/0.log" Oct 10 09:48:10 crc kubenswrapper[4732]: I1010 09:48:10.616621 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-7f9qv_4434a14e-fb27-4cfa-adef-4d4f02e5a775/nmstate-webhook/0.log" Oct 10 09:48:16 crc kubenswrapper[4732]: I1010 09:48:16.660841 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:48:16 crc kubenswrapper[4732]: E1010 09:48:16.661530 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:48:24 crc kubenswrapper[4732]: I1010 09:48:24.185829 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-68d546b9d8-xnbhc_118f95da-3c5d-403c-90ff-9a91056fa449/kube-rbac-proxy/0.log" Oct 10 09:48:24 crc kubenswrapper[4732]: I1010 09:48:24.491082 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-frr-files/0.log" Oct 10 09:48:24 crc kubenswrapper[4732]: I1010 09:48:24.690341 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-reloader/0.log" Oct 10 09:48:24 crc kubenswrapper[4732]: I1010 09:48:24.693356 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-frr-files/0.log" Oct 10 09:48:24 crc kubenswrapper[4732]: I1010 09:48:24.720145 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-metrics/0.log" Oct 10 09:48:24 crc kubenswrapper[4732]: I1010 09:48:24.737544 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xnbhc_118f95da-3c5d-403c-90ff-9a91056fa449/controller/0.log" Oct 10 09:48:24 crc kubenswrapper[4732]: I1010 09:48:24.917404 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-reloader/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.117423 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-metrics/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.122565 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-metrics/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.181112 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-reloader/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.189866 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-frr-files/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.311108 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-reloader/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.317036 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-frr-files/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.377251 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/cp-metrics/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.424544 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/controller/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.547791 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/frr-metrics/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.617273 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/kube-rbac-proxy-frr/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.617353 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/kube-rbac-proxy/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.807964 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/reloader/0.log" Oct 10 09:48:25 crc kubenswrapper[4732]: I1010 09:48:25.852060 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-tnccg_5cda66c6-9c63-41d3-b614-16bf38d53346/frr-k8s-webhook-server/0.log" Oct 10 09:48:26 crc kubenswrapper[4732]: I1010 09:48:26.032722 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-686c95bfd-424n7_d510ee8f-515b-4088-8bc2-afb87f7ccf6e/manager/0.log" Oct 10 09:48:26 crc kubenswrapper[4732]: I1010 09:48:26.221533 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cf5cdccc4-5mpts_7f75ce88-78e8-4e65-a7d5-eeb19c049313/webhook-server/0.log" Oct 10 09:48:26 crc kubenswrapper[4732]: I1010 09:48:26.297084 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gchds_fef199bb-8521-40db-9f75-221274c9299d/kube-rbac-proxy/0.log" Oct 10 09:48:27 crc kubenswrapper[4732]: I1010 09:48:27.261322 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gchds_fef199bb-8521-40db-9f75-221274c9299d/speaker/0.log" Oct 10 09:48:28 crc kubenswrapper[4732]: I1010 09:48:28.706678 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-62wtx_9748f00d-5c14-4cde-aea7-6d364ca08325/frr/0.log" Oct 10 09:48:29 crc kubenswrapper[4732]: I1010 09:48:29.660787 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:48:29 crc kubenswrapper[4732]: E1010 09:48:29.661087 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:48:39 crc kubenswrapper[4732]: I1010 09:48:39.828738 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq_643a9f80-158c-4f4c-a1a1-5feac2bac0c1/util/0.log" Oct 10 09:48:39 crc kubenswrapper[4732]: I1010 09:48:39.943326 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq_643a9f80-158c-4f4c-a1a1-5feac2bac0c1/util/0.log" Oct 10 09:48:39 crc kubenswrapper[4732]: I1010 09:48:39.998441 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq_643a9f80-158c-4f4c-a1a1-5feac2bac0c1/pull/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.030768 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq_643a9f80-158c-4f4c-a1a1-5feac2bac0c1/pull/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.207806 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq_643a9f80-158c-4f4c-a1a1-5feac2bac0c1/util/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.234069 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq_643a9f80-158c-4f4c-a1a1-5feac2bac0c1/extract/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.270904 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69hlcpq_643a9f80-158c-4f4c-a1a1-5feac2bac0c1/pull/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.404606 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt_2edbc338-d144-4bd8-a06a-3ea0537ba513/util/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.576989 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt_2edbc338-d144-4bd8-a06a-3ea0537ba513/pull/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.586782 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt_2edbc338-d144-4bd8-a06a-3ea0537ba513/util/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.606520 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt_2edbc338-d144-4bd8-a06a-3ea0537ba513/pull/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.765966 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt_2edbc338-d144-4bd8-a06a-3ea0537ba513/util/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.784392 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt_2edbc338-d144-4bd8-a06a-3ea0537ba513/extract/0.log" Oct 10 09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.788146 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d24bgzt_2edbc338-d144-4bd8-a06a-3ea0537ba513/pull/0.log" Oct 10 
09:48:40 crc kubenswrapper[4732]: I1010 09:48:40.922586 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq_56a273c6-4f36-48f0-83df-2bbe1b6fe6df/util/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.109398 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq_56a273c6-4f36-48f0-83df-2bbe1b6fe6df/util/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.140715 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq_56a273c6-4f36-48f0-83df-2bbe1b6fe6df/pull/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.145153 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq_56a273c6-4f36-48f0-83df-2bbe1b6fe6df/pull/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.346842 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq_56a273c6-4f36-48f0-83df-2bbe1b6fe6df/util/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.348056 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq_56a273c6-4f36-48f0-83df-2bbe1b6fe6df/extract/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.360615 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dptjjq_56a273c6-4f36-48f0-83df-2bbe1b6fe6df/pull/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.495611 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-nxp2x_892e4d19-13af-44ac-ad0c-709b2200a088/extract-utilities/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.702271 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nxp2x_892e4d19-13af-44ac-ad0c-709b2200a088/extract-content/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.715764 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nxp2x_892e4d19-13af-44ac-ad0c-709b2200a088/extract-content/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.720772 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nxp2x_892e4d19-13af-44ac-ad0c-709b2200a088/extract-utilities/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.882053 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nxp2x_892e4d19-13af-44ac-ad0c-709b2200a088/extract-utilities/0.log" Oct 10 09:48:41 crc kubenswrapper[4732]: I1010 09:48:41.945113 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nxp2x_892e4d19-13af-44ac-ad0c-709b2200a088/extract-content/0.log" Oct 10 09:48:42 crc kubenswrapper[4732]: I1010 09:48:42.116784 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-krqg8_01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e/extract-utilities/0.log" Oct 10 09:48:42 crc kubenswrapper[4732]: I1010 09:48:42.320604 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-krqg8_01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e/extract-content/0.log" Oct 10 09:48:42 crc kubenswrapper[4732]: I1010 09:48:42.387012 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-krqg8_01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e/extract-utilities/0.log" Oct 10 09:48:42 crc kubenswrapper[4732]: I1010 09:48:42.423480 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-krqg8_01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e/extract-content/0.log" Oct 10 09:48:42 crc kubenswrapper[4732]: I1010 09:48:42.592581 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-krqg8_01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e/extract-utilities/0.log" Oct 10 09:48:42 crc kubenswrapper[4732]: I1010 09:48:42.642985 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-krqg8_01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e/extract-content/0.log" Oct 10 09:48:42 crc kubenswrapper[4732]: I1010 09:48:42.915677 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e/util/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.123574 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e/util/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.209061 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e/pull/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.331239 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e/pull/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.500673 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e/pull/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.513519 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e/util/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.588926 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835cskfhb_76a7c2bc-c1e1-43e2-8398-b1d908b2d00e/extract/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.681255 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nxp2x_892e4d19-13af-44ac-ad0c-709b2200a088/registry-server/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.766903 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-54zqg_40dfbe5b-e4e3-4ab0-a5fb-e4db5ced6c4f/marketplace-operator/0.log" Oct 10 09:48:43 crc kubenswrapper[4732]: I1010 09:48:43.892083 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p868x_9853e548-8347-4b73-b09b-4315d6956d6a/extract-utilities/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.112817 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p868x_9853e548-8347-4b73-b09b-4315d6956d6a/extract-utilities/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.145653 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p868x_9853e548-8347-4b73-b09b-4315d6956d6a/extract-content/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.153300 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-p868x_9853e548-8347-4b73-b09b-4315d6956d6a/extract-content/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.156879 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-krqg8_01a60e2a-dce1-4cfe-a977-1a2eb6e18e1e/registry-server/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.325002 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p868x_9853e548-8347-4b73-b09b-4315d6956d6a/extract-content/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.403542 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p868x_9853e548-8347-4b73-b09b-4315d6956d6a/extract-utilities/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.420653 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6prtm_e1b4b052-f1f8-45d4-b837-3acca46f7e39/extract-utilities/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.645394 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6prtm_e1b4b052-f1f8-45d4-b837-3acca46f7e39/extract-content/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.645480 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p868x_9853e548-8347-4b73-b09b-4315d6956d6a/registry-server/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.658360 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6prtm_e1b4b052-f1f8-45d4-b837-3acca46f7e39/extract-utilities/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.660445 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:48:44 crc kubenswrapper[4732]: E1010 
09:48:44.660681 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.709334 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6prtm_e1b4b052-f1f8-45d4-b837-3acca46f7e39/extract-content/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.829510 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6prtm_e1b4b052-f1f8-45d4-b837-3acca46f7e39/extract-utilities/0.log" Oct 10 09:48:44 crc kubenswrapper[4732]: I1010 09:48:44.854028 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6prtm_e1b4b052-f1f8-45d4-b837-3acca46f7e39/extract-content/0.log" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.498201 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rcksk"] Oct 10 09:48:45 crc kubenswrapper[4732]: E1010 09:48:45.498626 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba3c27c-e134-4744-92c2-0f195812eebd" containerName="container-00" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.498637 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba3c27c-e134-4744-92c2-0f195812eebd" containerName="container-00" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.498851 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba3c27c-e134-4744-92c2-0f195812eebd" containerName="container-00" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.500334 4732 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.514221 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rcksk"] Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.601142 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-catalog-content\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.601185 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wm25\" (UniqueName: \"kubernetes.io/projected/cca3c58b-99dc-4468-8ce4-883c3630e3dd-kube-api-access-4wm25\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.601347 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-utilities\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.703296 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-utilities\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.704111 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-utilities\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.710824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-catalog-content\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.710867 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wm25\" (UniqueName: \"kubernetes.io/projected/cca3c58b-99dc-4468-8ce4-883c3630e3dd-kube-api-access-4wm25\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.711953 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-catalog-content\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.736267 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wm25\" (UniqueName: \"kubernetes.io/projected/cca3c58b-99dc-4468-8ce4-883c3630e3dd-kube-api-access-4wm25\") pod \"community-operators-rcksk\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:45 crc kubenswrapper[4732]: I1010 09:48:45.823885 4732 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:46 crc kubenswrapper[4732]: I1010 09:48:46.053093 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6prtm_e1b4b052-f1f8-45d4-b837-3acca46f7e39/registry-server/0.log" Oct 10 09:48:46 crc kubenswrapper[4732]: I1010 09:48:46.429231 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rcksk"] Oct 10 09:48:46 crc kubenswrapper[4732]: I1010 09:48:46.780168 4732 generic.go:334] "Generic (PLEG): container finished" podID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerID="d50606264f3ceedde33a4a549142eda5377d72efab5efa75864ed7ffd2f35c6a" exitCode=0 Oct 10 09:48:46 crc kubenswrapper[4732]: I1010 09:48:46.780485 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcksk" event={"ID":"cca3c58b-99dc-4468-8ce4-883c3630e3dd","Type":"ContainerDied","Data":"d50606264f3ceedde33a4a549142eda5377d72efab5efa75864ed7ffd2f35c6a"} Oct 10 09:48:46 crc kubenswrapper[4732]: I1010 09:48:46.780515 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcksk" event={"ID":"cca3c58b-99dc-4468-8ce4-883c3630e3dd","Type":"ContainerStarted","Data":"cb58501ea788cf671530f5a0d3c12be56232845439c358dfd2d9e946a0b2c201"} Oct 10 09:48:46 crc kubenswrapper[4732]: I1010 09:48:46.782494 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 10 09:48:47 crc kubenswrapper[4732]: I1010 09:48:47.794775 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcksk" event={"ID":"cca3c58b-99dc-4468-8ce4-883c3630e3dd","Type":"ContainerStarted","Data":"22222f54a017e0925b49af804674bdfe303b70161992e131bce8f8537942dac7"} Oct 10 09:48:48 crc kubenswrapper[4732]: I1010 09:48:48.807714 4732 generic.go:334] "Generic (PLEG): 
container finished" podID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerID="22222f54a017e0925b49af804674bdfe303b70161992e131bce8f8537942dac7" exitCode=0 Oct 10 09:48:48 crc kubenswrapper[4732]: I1010 09:48:48.807815 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcksk" event={"ID":"cca3c58b-99dc-4468-8ce4-883c3630e3dd","Type":"ContainerDied","Data":"22222f54a017e0925b49af804674bdfe303b70161992e131bce8f8537942dac7"} Oct 10 09:48:49 crc kubenswrapper[4732]: I1010 09:48:49.819587 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcksk" event={"ID":"cca3c58b-99dc-4468-8ce4-883c3630e3dd","Type":"ContainerStarted","Data":"305ff1591244070add0282c0eacb2ee813299db292346f1d39fb75a1f60d1f73"} Oct 10 09:48:49 crc kubenswrapper[4732]: I1010 09:48:49.836496 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rcksk" podStartSLOduration=2.354846078 podStartE2EDuration="4.836479347s" podCreationTimestamp="2025-10-10 09:48:45 +0000 UTC" firstStartedPulling="2025-10-10 09:48:46.782267768 +0000 UTC m=+10653.851858999" lastFinishedPulling="2025-10-10 09:48:49.263901027 +0000 UTC m=+10656.333492268" observedRunningTime="2025-10-10 09:48:49.835926612 +0000 UTC m=+10656.905517873" watchObservedRunningTime="2025-10-10 09:48:49.836479347 +0000 UTC m=+10656.906070598" Oct 10 09:48:55 crc kubenswrapper[4732]: I1010 09:48:55.824124 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:55 crc kubenswrapper[4732]: I1010 09:48:55.824598 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:55 crc kubenswrapper[4732]: I1010 09:48:55.890393 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:55 crc kubenswrapper[4732]: I1010 09:48:55.951494 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:56 crc kubenswrapper[4732]: I1010 09:48:56.127042 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rcksk"] Oct 10 09:48:57 crc kubenswrapper[4732]: I1010 09:48:57.905143 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rcksk" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="registry-server" containerID="cri-o://305ff1591244070add0282c0eacb2ee813299db292346f1d39fb75a1f60d1f73" gracePeriod=2 Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.236126 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-2bz54_b83736e0-1c5a-4160-a56a-eaaa894e994d/prometheus-operator/0.log" Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.482320 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54c4c6765f-gd956_96c3930e-4172-47a4-85fa-d421a85fa25f/prometheus-operator-admission-webhook/0.log" Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.539157 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54c4c6765f-t72cf_43e0e17c-6e37-40b8-9e7f-72c0bb4ceb23/prometheus-operator-admission-webhook/0.log" Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.698687 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-tds2n_3e0db6e5-6938-46a4-91c7-9eb584d07a5d/operator/0.log" Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.829811 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-cwnj8_2f3c5b4f-3be5-47c7-a2af-3e85122d303b/perses-operator/0.log" Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.916667 4732 generic.go:334] "Generic (PLEG): container finished" podID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerID="305ff1591244070add0282c0eacb2ee813299db292346f1d39fb75a1f60d1f73" exitCode=0 Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.916733 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcksk" event={"ID":"cca3c58b-99dc-4468-8ce4-883c3630e3dd","Type":"ContainerDied","Data":"305ff1591244070add0282c0eacb2ee813299db292346f1d39fb75a1f60d1f73"} Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.916765 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rcksk" event={"ID":"cca3c58b-99dc-4468-8ce4-883c3630e3dd","Type":"ContainerDied","Data":"cb58501ea788cf671530f5a0d3c12be56232845439c358dfd2d9e946a0b2c201"} Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.916779 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb58501ea788cf671530f5a0d3c12be56232845439c358dfd2d9e946a0b2c201" Oct 10 09:48:58 crc kubenswrapper[4732]: I1010 09:48:58.936084 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.033092 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-utilities\") pod \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.033403 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-catalog-content\") pod \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.033531 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wm25\" (UniqueName: \"kubernetes.io/projected/cca3c58b-99dc-4468-8ce4-883c3630e3dd-kube-api-access-4wm25\") pod \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\" (UID: \"cca3c58b-99dc-4468-8ce4-883c3630e3dd\") " Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.034074 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-utilities" (OuterVolumeSpecName: "utilities") pod "cca3c58b-99dc-4468-8ce4-883c3630e3dd" (UID: "cca3c58b-99dc-4468-8ce4-883c3630e3dd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.034517 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.042834 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca3c58b-99dc-4468-8ce4-883c3630e3dd-kube-api-access-4wm25" (OuterVolumeSpecName: "kube-api-access-4wm25") pod "cca3c58b-99dc-4468-8ce4-883c3630e3dd" (UID: "cca3c58b-99dc-4468-8ce4-883c3630e3dd"). InnerVolumeSpecName "kube-api-access-4wm25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.100971 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cca3c58b-99dc-4468-8ce4-883c3630e3dd" (UID: "cca3c58b-99dc-4468-8ce4-883c3630e3dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.154448 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cca3c58b-99dc-4468-8ce4-883c3630e3dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.155306 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wm25\" (UniqueName: \"kubernetes.io/projected/cca3c58b-99dc-4468-8ce4-883c3630e3dd-kube-api-access-4wm25\") on node \"crc\" DevicePath \"\"" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.661723 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:48:59 crc kubenswrapper[4732]: E1010 09:48:59.662138 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.927023 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rcksk" Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.953251 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rcksk"] Oct 10 09:48:59 crc kubenswrapper[4732]: I1010 09:48:59.963154 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rcksk"] Oct 10 09:49:01 crc kubenswrapper[4732]: I1010 09:49:01.674143 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" path="/var/lib/kubelet/pods/cca3c58b-99dc-4468-8ce4-883c3630e3dd/volumes" Oct 10 09:49:14 crc kubenswrapper[4732]: I1010 09:49:14.660668 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:49:14 crc kubenswrapper[4732]: E1010 09:49:14.661472 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:49:27 crc kubenswrapper[4732]: I1010 09:49:27.661118 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:49:27 crc kubenswrapper[4732]: E1010 09:49:27.662987 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" 
podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:49:38 crc kubenswrapper[4732]: I1010 09:49:38.660150 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:49:38 crc kubenswrapper[4732]: E1010 09:49:38.660859 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:49:52 crc kubenswrapper[4732]: I1010 09:49:52.661012 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:49:52 crc kubenswrapper[4732]: E1010 09:49:52.661804 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:50:05 crc kubenswrapper[4732]: I1010 09:50:05.660315 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:50:05 crc kubenswrapper[4732]: E1010 09:50:05.661369 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:50:16 crc kubenswrapper[4732]: I1010 09:50:16.660093 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:50:16 crc kubenswrapper[4732]: E1010 09:50:16.660960 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:50:31 crc kubenswrapper[4732]: I1010 09:50:31.662333 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:50:31 crc kubenswrapper[4732]: E1010 09:50:31.663336 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:50:45 crc kubenswrapper[4732]: I1010 09:50:45.661381 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:50:45 crc kubenswrapper[4732]: E1010 09:50:45.662767 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-292kd_openshift-machine-config-operator(1ca39c55-1a82-41b2-b7d5-925320a4e8a0)\"" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" Oct 10 09:50:59 crc kubenswrapper[4732]: I1010 09:50:59.660419 4732 scope.go:117] "RemoveContainer" containerID="a5a73b5050e408f0aa6e3accb10d1c43c259355393e2fa2f9220fc7c977d217b" Oct 10 09:51:00 crc kubenswrapper[4732]: I1010 09:51:00.378393 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-292kd" event={"ID":"1ca39c55-1a82-41b2-b7d5-925320a4e8a0","Type":"ContainerStarted","Data":"ffba88a6b5bdf3fe5b3745070335920388e11625908090de1932645affc33a6c"} Oct 10 09:51:07 crc kubenswrapper[4732]: I1010 09:51:07.461633 4732 generic.go:334] "Generic (PLEG): container finished" podID="10e4f98d-cb01-497a-838a-8308d49241e6" containerID="3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639" exitCode=0 Oct 10 09:51:07 crc kubenswrapper[4732]: I1010 09:51:07.461782 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s2kwv/must-gather-95b45" event={"ID":"10e4f98d-cb01-497a-838a-8308d49241e6","Type":"ContainerDied","Data":"3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639"} Oct 10 09:51:07 crc kubenswrapper[4732]: I1010 09:51:07.462878 4732 scope.go:117] "RemoveContainer" containerID="3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639" Oct 10 09:51:08 crc kubenswrapper[4732]: I1010 09:51:08.378744 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s2kwv_must-gather-95b45_10e4f98d-cb01-497a-838a-8308d49241e6/gather/0.log" Oct 10 09:51:16 crc kubenswrapper[4732]: I1010 09:51:16.559428 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s2kwv/must-gather-95b45"] Oct 10 09:51:16 crc kubenswrapper[4732]: I1010 09:51:16.560242 4732 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-must-gather-s2kwv/must-gather-95b45" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" containerName="copy" containerID="cri-o://418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45" gracePeriod=2 Oct 10 09:51:16 crc kubenswrapper[4732]: I1010 09:51:16.583777 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s2kwv/must-gather-95b45"] Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.039089 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s2kwv_must-gather-95b45_10e4f98d-cb01-497a-838a-8308d49241e6/copy/0.log" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.040046 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.062903 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psqpb\" (UniqueName: \"kubernetes.io/projected/10e4f98d-cb01-497a-838a-8308d49241e6-kube-api-access-psqpb\") pod \"10e4f98d-cb01-497a-838a-8308d49241e6\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.063087 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10e4f98d-cb01-497a-838a-8308d49241e6-must-gather-output\") pod \"10e4f98d-cb01-497a-838a-8308d49241e6\" (UID: \"10e4f98d-cb01-497a-838a-8308d49241e6\") " Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.070352 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10e4f98d-cb01-497a-838a-8308d49241e6-kube-api-access-psqpb" (OuterVolumeSpecName: "kube-api-access-psqpb") pod "10e4f98d-cb01-497a-838a-8308d49241e6" (UID: "10e4f98d-cb01-497a-838a-8308d49241e6"). InnerVolumeSpecName "kube-api-access-psqpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.166798 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psqpb\" (UniqueName: \"kubernetes.io/projected/10e4f98d-cb01-497a-838a-8308d49241e6-kube-api-access-psqpb\") on node \"crc\" DevicePath \"\"" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.268580 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10e4f98d-cb01-497a-838a-8308d49241e6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "10e4f98d-cb01-497a-838a-8308d49241e6" (UID: "10e4f98d-cb01-497a-838a-8308d49241e6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.370631 4732 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/10e4f98d-cb01-497a-838a-8308d49241e6-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.573058 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s2kwv_must-gather-95b45_10e4f98d-cb01-497a-838a-8308d49241e6/copy/0.log" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.574402 4732 generic.go:334] "Generic (PLEG): container finished" podID="10e4f98d-cb01-497a-838a-8308d49241e6" containerID="418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45" exitCode=143 Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.574573 4732 scope.go:117] "RemoveContainer" containerID="418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.574842 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s2kwv/must-gather-95b45" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.613239 4732 scope.go:117] "RemoveContainer" containerID="3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.674042 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" path="/var/lib/kubelet/pods/10e4f98d-cb01-497a-838a-8308d49241e6/volumes" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.704918 4732 scope.go:117] "RemoveContainer" containerID="418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45" Oct 10 09:51:17 crc kubenswrapper[4732]: E1010 09:51:17.706341 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45\": container with ID starting with 418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45 not found: ID does not exist" containerID="418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.706442 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45"} err="failed to get container status \"418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45\": rpc error: code = NotFound desc = could not find container \"418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45\": container with ID starting with 418603067f36787587f8839684d7e1fc0c8bf3d116df29d2191b5a7abc24cb45 not found: ID does not exist" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.706520 4732 scope.go:117] "RemoveContainer" containerID="3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639" Oct 10 09:51:17 crc kubenswrapper[4732]: E1010 09:51:17.706959 4732 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639\": container with ID starting with 3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639 not found: ID does not exist" containerID="3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639" Oct 10 09:51:17 crc kubenswrapper[4732]: I1010 09:51:17.707066 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639"} err="failed to get container status \"3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639\": rpc error: code = NotFound desc = could not find container \"3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639\": container with ID starting with 3671d2eea9aaa007f2cd569e817b85b6db785ae6fc53f9243fb1263c6d805639 not found: ID does not exist" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.862079 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kznmf"] Oct 10 09:51:28 crc kubenswrapper[4732]: E1010 09:51:28.863067 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="extract-content" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863081 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="extract-content" Oct 10 09:51:28 crc kubenswrapper[4732]: E1010 09:51:28.863096 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="extract-utilities" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863102 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="extract-utilities" Oct 10 09:51:28 crc kubenswrapper[4732]: 
E1010 09:51:28.863139 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" containerName="copy" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863146 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" containerName="copy" Oct 10 09:51:28 crc kubenswrapper[4732]: E1010 09:51:28.863166 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="registry-server" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863172 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="registry-server" Oct 10 09:51:28 crc kubenswrapper[4732]: E1010 09:51:28.863185 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" containerName="gather" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863191 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" containerName="gather" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863377 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca3c58b-99dc-4468-8ce4-883c3630e3dd" containerName="registry-server" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863391 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" containerName="gather" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.863408 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="10e4f98d-cb01-497a-838a-8308d49241e6" containerName="copy" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.864918 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:28 crc kubenswrapper[4732]: I1010 09:51:28.876520 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kznmf"] Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.038911 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-utilities\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.039022 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-852wq\" (UniqueName: \"kubernetes.io/projected/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-kube-api-access-852wq\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.039121 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-catalog-content\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.140409 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-utilities\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.140497 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-852wq\" (UniqueName: \"kubernetes.io/projected/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-kube-api-access-852wq\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.140595 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-catalog-content\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.141054 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-utilities\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.141121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-catalog-content\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.164316 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-852wq\" (UniqueName: \"kubernetes.io/projected/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-kube-api-access-852wq\") pod \"certified-operators-kznmf\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.196078 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:29 crc kubenswrapper[4732]: I1010 09:51:29.702418 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kznmf"] Oct 10 09:51:29 crc kubenswrapper[4732]: W1010 09:51:29.708489 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ff2ab85_c63e_4835_a8f5_c71520ed6e9a.slice/crio-257227d776df52dfef9705c7f6e96d122b8075baaa9ceff56307a98520755afc WatchSource:0}: Error finding container 257227d776df52dfef9705c7f6e96d122b8075baaa9ceff56307a98520755afc: Status 404 returned error can't find the container with id 257227d776df52dfef9705c7f6e96d122b8075baaa9ceff56307a98520755afc Oct 10 09:51:30 crc kubenswrapper[4732]: I1010 09:51:30.741029 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" containerID="279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464" exitCode=0 Oct 10 09:51:30 crc kubenswrapper[4732]: I1010 09:51:30.741267 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kznmf" event={"ID":"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a","Type":"ContainerDied","Data":"279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464"} Oct 10 09:51:30 crc kubenswrapper[4732]: I1010 09:51:30.741292 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kznmf" event={"ID":"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a","Type":"ContainerStarted","Data":"257227d776df52dfef9705c7f6e96d122b8075baaa9ceff56307a98520755afc"} Oct 10 09:51:31 crc kubenswrapper[4732]: I1010 09:51:31.765115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kznmf" 
event={"ID":"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a","Type":"ContainerStarted","Data":"e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d"} Oct 10 09:51:32 crc kubenswrapper[4732]: I1010 09:51:32.780473 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" containerID="e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d" exitCode=0 Oct 10 09:51:32 crc kubenswrapper[4732]: I1010 09:51:32.780574 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kznmf" event={"ID":"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a","Type":"ContainerDied","Data":"e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d"} Oct 10 09:51:34 crc kubenswrapper[4732]: I1010 09:51:34.807124 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kznmf" event={"ID":"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a","Type":"ContainerStarted","Data":"a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00"} Oct 10 09:51:34 crc kubenswrapper[4732]: I1010 09:51:34.832951 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kznmf" podStartSLOduration=4.013289192 podStartE2EDuration="6.832932112s" podCreationTimestamp="2025-10-10 09:51:28 +0000 UTC" firstStartedPulling="2025-10-10 09:51:30.745916867 +0000 UTC m=+10817.815508108" lastFinishedPulling="2025-10-10 09:51:33.565559737 +0000 UTC m=+10820.635151028" observedRunningTime="2025-10-10 09:51:34.820978217 +0000 UTC m=+10821.890569508" watchObservedRunningTime="2025-10-10 09:51:34.832932112 +0000 UTC m=+10821.902523353" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.044806 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cgd8p"] Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.046936 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.085850 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgd8p"] Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.235899 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-utilities\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.236259 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frprq\" (UniqueName: \"kubernetes.io/projected/f9838cb2-6985-4547-91f3-835654518738-kube-api-access-frprq\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.236337 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-catalog-content\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.339162 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-utilities\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.339213 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-frprq\" (UniqueName: \"kubernetes.io/projected/f9838cb2-6985-4547-91f3-835654518738-kube-api-access-frprq\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.339287 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-catalog-content\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.339803 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-catalog-content\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.340041 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-utilities\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.358051 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frprq\" (UniqueName: \"kubernetes.io/projected/f9838cb2-6985-4547-91f3-835654518738-kube-api-access-frprq\") pod \"redhat-marketplace-cgd8p\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") " pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.385622 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgd8p" Oct 10 09:51:38 crc kubenswrapper[4732]: I1010 09:51:38.877785 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgd8p"] Oct 10 09:51:38 crc kubenswrapper[4732]: W1010 09:51:38.894159 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9838cb2_6985_4547_91f3_835654518738.slice/crio-8ff257367ead069e73d79734c3f655459331243866f85e6e6a74aad19e799387 WatchSource:0}: Error finding container 8ff257367ead069e73d79734c3f655459331243866f85e6e6a74aad19e799387: Status 404 returned error can't find the container with id 8ff257367ead069e73d79734c3f655459331243866f85e6e6a74aad19e799387 Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.197425 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.198033 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.280023 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.870547 4732 generic.go:334] "Generic (PLEG): container finished" podID="f9838cb2-6985-4547-91f3-835654518738" containerID="982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056" exitCode=0 Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.873529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgd8p" event={"ID":"f9838cb2-6985-4547-91f3-835654518738","Type":"ContainerDied","Data":"982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056"} Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.873570 
4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgd8p" event={"ID":"f9838cb2-6985-4547-91f3-835654518738","Type":"ContainerStarted","Data":"8ff257367ead069e73d79734c3f655459331243866f85e6e6a74aad19e799387"} Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.945834 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.948649 4732 scope.go:117] "RemoveContainer" containerID="a3f7f5e32e74e42098b9828a9a936f0553e8fb20162c4e651bbd372518594c72" Oct 10 09:51:39 crc kubenswrapper[4732]: I1010 09:51:39.977341 4732 scope.go:117] "RemoveContainer" containerID="153a960bcc689acf6d2a6cd3039a0a7e0f208cff6b7a74ece489d33d94080680" Oct 10 09:51:41 crc kubenswrapper[4732]: I1010 09:51:41.621493 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kznmf"] Oct 10 09:51:41 crc kubenswrapper[4732]: I1010 09:51:41.894437 4732 generic.go:334] "Generic (PLEG): container finished" podID="f9838cb2-6985-4547-91f3-835654518738" containerID="94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a" exitCode=0 Oct 10 09:51:41 crc kubenswrapper[4732]: I1010 09:51:41.894581 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgd8p" event={"ID":"f9838cb2-6985-4547-91f3-835654518738","Type":"ContainerDied","Data":"94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a"} Oct 10 09:51:41 crc kubenswrapper[4732]: I1010 09:51:41.894675 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kznmf" podUID="6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" containerName="registry-server" containerID="cri-o://a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00" gracePeriod=2 Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.386600 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.565793 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-852wq\" (UniqueName: \"kubernetes.io/projected/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-kube-api-access-852wq\") pod \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.566039 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-utilities\") pod \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.566162 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-catalog-content\") pod \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\" (UID: \"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a\") " Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.566803 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-utilities" (OuterVolumeSpecName: "utilities") pod "6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" (UID: "6ff2ab85-c63e-4835-a8f5-c71520ed6e9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.572955 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-kube-api-access-852wq" (OuterVolumeSpecName: "kube-api-access-852wq") pod "6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" (UID: "6ff2ab85-c63e-4835-a8f5-c71520ed6e9a"). 
InnerVolumeSpecName "kube-api-access-852wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.619317 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" (UID: "6ff2ab85-c63e-4835-a8f5-c71520ed6e9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.668553 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.668599 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-852wq\" (UniqueName: \"kubernetes.io/projected/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-kube-api-access-852wq\") on node \"crc\" DevicePath \"\"" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.668619 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a-utilities\") on node \"crc\" DevicePath \"\"" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.906190 4732 generic.go:334] "Generic (PLEG): container finished" podID="6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" containerID="a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00" exitCode=0 Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.906281 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kznmf" event={"ID":"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a","Type":"ContainerDied","Data":"a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00"} Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.906295 
4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kznmf" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.906326 4732 scope.go:117] "RemoveContainer" containerID="a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.906311 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kznmf" event={"ID":"6ff2ab85-c63e-4835-a8f5-c71520ed6e9a","Type":"ContainerDied","Data":"257227d776df52dfef9705c7f6e96d122b8075baaa9ceff56307a98520755afc"} Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.910506 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgd8p" event={"ID":"f9838cb2-6985-4547-91f3-835654518738","Type":"ContainerStarted","Data":"f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101"} Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.931392 4732 scope.go:117] "RemoveContainer" containerID="e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.960769 4732 scope.go:117] "RemoveContainer" containerID="279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.967975 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cgd8p" podStartSLOduration=2.38968329 podStartE2EDuration="4.967947623s" podCreationTimestamp="2025-10-10 09:51:38 +0000 UTC" firstStartedPulling="2025-10-10 09:51:39.875802701 +0000 UTC m=+10826.945393942" lastFinishedPulling="2025-10-10 09:51:42.454067034 +0000 UTC m=+10829.523658275" observedRunningTime="2025-10-10 09:51:42.939976813 +0000 UTC m=+10830.009568064" watchObservedRunningTime="2025-10-10 09:51:42.967947623 +0000 UTC m=+10830.037538904" Oct 10 09:51:42 crc kubenswrapper[4732]: 
I1010 09:51:42.981458 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kznmf"] Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.990411 4732 scope.go:117] "RemoveContainer" containerID="a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00" Oct 10 09:51:42 crc kubenswrapper[4732]: E1010 09:51:42.990865 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00\": container with ID starting with a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00 not found: ID does not exist" containerID="a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.990905 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00"} err="failed to get container status \"a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00\": rpc error: code = NotFound desc = could not find container \"a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00\": container with ID starting with a64b3859908b1323b39c7f8589acf154f2dbd0241eddccb3edde96d1b168fb00 not found: ID does not exist" Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.990932 4732 scope.go:117] "RemoveContainer" containerID="e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d" Oct 10 09:51:42 crc kubenswrapper[4732]: E1010 09:51:42.991466 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d\": container with ID starting with e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d not found: ID does not exist" 
containerID="e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d"
Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.991507 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d"} err="failed to get container status \"e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d\": rpc error: code = NotFound desc = could not find container \"e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d\": container with ID starting with e22b52af8af49acbea4138359c4199b2cc8d8408d1d5ac56581ee3c77146fd9d not found: ID does not exist"
Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.991562 4732 scope.go:117] "RemoveContainer" containerID="279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464"
Oct 10 09:51:42 crc kubenswrapper[4732]: E1010 09:51:42.991952 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464\": container with ID starting with 279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464 not found: ID does not exist" containerID="279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464"
Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.991981 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464"} err="failed to get container status \"279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464\": rpc error: code = NotFound desc = could not find container \"279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464\": container with ID starting with 279b14a6dcc57ed90925c14fbf54de87d71e1cf04e3b14fbf6fec4a08027e464 not found: ID does not exist"
Oct 10 09:51:42 crc kubenswrapper[4732]: I1010 09:51:42.992591 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kznmf"]
Oct 10 09:51:43 crc kubenswrapper[4732]: I1010 09:51:43.672070 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff2ab85-c63e-4835-a8f5-c71520ed6e9a" path="/var/lib/kubelet/pods/6ff2ab85-c63e-4835-a8f5-c71520ed6e9a/volumes"
Oct 10 09:51:48 crc kubenswrapper[4732]: I1010 09:51:48.385851 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cgd8p"
Oct 10 09:51:48 crc kubenswrapper[4732]: I1010 09:51:48.387504 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cgd8p"
Oct 10 09:51:48 crc kubenswrapper[4732]: I1010 09:51:48.461319 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cgd8p"
Oct 10 09:51:49 crc kubenswrapper[4732]: I1010 09:51:49.102204 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cgd8p"
Oct 10 09:51:49 crc kubenswrapper[4732]: I1010 09:51:49.175237 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgd8p"]
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.029353 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cgd8p" podUID="f9838cb2-6985-4547-91f3-835654518738" containerName="registry-server" containerID="cri-o://f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101" gracePeriod=2
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.571351 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgd8p"
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.679742 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-utilities\") pod \"f9838cb2-6985-4547-91f3-835654518738\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") "
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.679848 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frprq\" (UniqueName: \"kubernetes.io/projected/f9838cb2-6985-4547-91f3-835654518738-kube-api-access-frprq\") pod \"f9838cb2-6985-4547-91f3-835654518738\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") "
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.679895 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-catalog-content\") pod \"f9838cb2-6985-4547-91f3-835654518738\" (UID: \"f9838cb2-6985-4547-91f3-835654518738\") "
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.681742 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-utilities" (OuterVolumeSpecName: "utilities") pod "f9838cb2-6985-4547-91f3-835654518738" (UID: "f9838cb2-6985-4547-91f3-835654518738"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.700649 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9838cb2-6985-4547-91f3-835654518738" (UID: "f9838cb2-6985-4547-91f3-835654518738"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.722852 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9838cb2-6985-4547-91f3-835654518738-kube-api-access-frprq" (OuterVolumeSpecName: "kube-api-access-frprq") pod "f9838cb2-6985-4547-91f3-835654518738" (UID: "f9838cb2-6985-4547-91f3-835654518738"). InnerVolumeSpecName "kube-api-access-frprq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.782869 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-utilities\") on node \"crc\" DevicePath \"\""
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.782921 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frprq\" (UniqueName: \"kubernetes.io/projected/f9838cb2-6985-4547-91f3-835654518738-kube-api-access-frprq\") on node \"crc\" DevicePath \"\""
Oct 10 09:51:51 crc kubenswrapper[4732]: I1010 09:51:51.782940 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9838cb2-6985-4547-91f3-835654518738-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.046338 4732 generic.go:334] "Generic (PLEG): container finished" podID="f9838cb2-6985-4547-91f3-835654518738" containerID="f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101" exitCode=0
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.046398 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgd8p" event={"ID":"f9838cb2-6985-4547-91f3-835654518738","Type":"ContainerDied","Data":"f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101"}
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.046426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgd8p" event={"ID":"f9838cb2-6985-4547-91f3-835654518738","Type":"ContainerDied","Data":"8ff257367ead069e73d79734c3f655459331243866f85e6e6a74aad19e799387"}
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.046444 4732 scope.go:117] "RemoveContainer" containerID="f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.046595 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgd8p"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.079431 4732 scope.go:117] "RemoveContainer" containerID="94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.089134 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgd8p"]
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.100828 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgd8p"]
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.111654 4732 scope.go:117] "RemoveContainer" containerID="982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.174542 4732 scope.go:117] "RemoveContainer" containerID="f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101"
Oct 10 09:51:52 crc kubenswrapper[4732]: E1010 09:51:52.175025 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101\": container with ID starting with f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101 not found: ID does not exist" containerID="f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.175057 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101"} err="failed to get container status \"f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101\": rpc error: code = NotFound desc = could not find container \"f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101\": container with ID starting with f561ed38eea2cfef9428f18d566a9f648772221e3f6e3ea39449819fe0cb8101 not found: ID does not exist"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.175076 4732 scope.go:117] "RemoveContainer" containerID="94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a"
Oct 10 09:51:52 crc kubenswrapper[4732]: E1010 09:51:52.175410 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a\": container with ID starting with 94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a not found: ID does not exist" containerID="94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.175457 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a"} err="failed to get container status \"94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a\": rpc error: code = NotFound desc = could not find container \"94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a\": container with ID starting with 94d164febb71be2fe55c3c7c9f3867d73b475b46affe0cd8e0060c0a00f20c9a not found: ID does not exist"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.175485 4732 scope.go:117] "RemoveContainer" containerID="982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056"
Oct 10 09:51:52 crc kubenswrapper[4732]: E1010 09:51:52.175753 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056\": container with ID starting with 982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056 not found: ID does not exist" containerID="982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056"
Oct 10 09:51:52 crc kubenswrapper[4732]: I1010 09:51:52.175774 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056"} err="failed to get container status \"982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056\": rpc error: code = NotFound desc = could not find container \"982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056\": container with ID starting with 982578c527732fa00ca9848f98a7191873119b8f8eaf03d89fbbfa0c8add0056 not found: ID does not exist"
Oct 10 09:51:53 crc kubenswrapper[4732]: I1010 09:51:53.682184 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9838cb2-6985-4547-91f3-835654518738" path="/var/lib/kubelet/pods/f9838cb2-6985-4547-91f3-835654518738/volumes"
Oct 10 09:53:25 crc kubenswrapper[4732]: I1010 09:53:25.358949 4732 patch_prober.go:28] interesting pod/machine-config-daemon-292kd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 10 09:53:25 crc kubenswrapper[4732]: I1010 09:53:25.359587 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-292kd" podUID="1ca39c55-1a82-41b2-b7d5-925320a4e8a0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"